Feb 19 19:18:19 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 19 19:18:19 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 19:18:19 crc restorecon[4692]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc 
restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc 
restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 
19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 
crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 
19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc 
restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc 
restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc 
restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 19:18:19 crc restorecon[4692]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:19 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 
crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc 
restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 19:18:20 crc restorecon[4692]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc 
restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 19:18:20 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 19 19:18:20 crc kubenswrapper[4722]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 19:18:20 crc kubenswrapper[4722]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 19 19:18:20 crc kubenswrapper[4722]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 19:18:20 crc kubenswrapper[4722]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 19 19:18:20 crc kubenswrapper[4722]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 19 19:18:20 crc kubenswrapper[4722]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.834512 4722 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847187 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847219 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847232 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847243 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847257 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847268 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847278 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847289 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847298 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847307 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847317 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847326 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847334 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847344 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847363 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847375 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847385 4722 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847397 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847408 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847419 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847430 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847442 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847452 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847463 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847474 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847485 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847496 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847506 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847517 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847528 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847539 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847550 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847559 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847568 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847577 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847589 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847599 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847608 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847617 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847625 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847821 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847831 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847839 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847848 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847856 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847864 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847873 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847881 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847889 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847897 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847908 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847916 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847926 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847934 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847942 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847951 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847959 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847968 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847976 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.847995 4722 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848003 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848011 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848020 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848028 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848037 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848045 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848053 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848062 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848074 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848084 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.848094 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848289 4722 flags.go:64] FLAG: --address="0.0.0.0"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848307 4722 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848323 4722 flags.go:64] FLAG: --anonymous-auth="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848335 4722 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848347 4722 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848358 4722 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848370 4722 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848382 4722 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848392 4722 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848402 4722 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848413 4722 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848424 4722 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848434 4722 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848447 4722 flags.go:64] FLAG: --cgroup-root=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848457 4722 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848468 4722 flags.go:64] FLAG: --client-ca-file=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848478 4722 flags.go:64] FLAG: --cloud-config=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848488 4722 flags.go:64] FLAG: --cloud-provider=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848498 4722 flags.go:64] FLAG: --cluster-dns="[]"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848509 4722 flags.go:64] FLAG: --cluster-domain=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848518 4722 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848529 4722 flags.go:64] FLAG: --config-dir=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848538 4722 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848549 4722 flags.go:64] FLAG: --container-log-max-files="5"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848560 4722 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848570 4722 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848581 4722 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848591 4722 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848602 4722 flags.go:64] FLAG: --contention-profiling="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848612 4722 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848621 4722 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848632 4722 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848641 4722 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848654 4722 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848664 4722 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848673 4722 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848683 4722 flags.go:64] FLAG: --enable-load-reader="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848693 4722 flags.go:64] FLAG: --enable-server="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848703 4722 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848716 4722 flags.go:64] FLAG: --event-burst="100"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848726 4722 flags.go:64] FLAG: --event-qps="50"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848736 4722 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848746 4722 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848756 4722 flags.go:64] FLAG: --eviction-hard=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848768 4722 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848778 4722 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848788 4722 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848798 4722 flags.go:64] FLAG: --eviction-soft=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848808 4722 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848818 4722 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848828 4722 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848839 4722 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848849 4722 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848860 4722 flags.go:64] FLAG: --fail-swap-on="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848870 4722 flags.go:64] FLAG: --feature-gates=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848882 4722 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848892 4722 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848902 4722 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848912 4722 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848922 4722 flags.go:64] FLAG: --healthz-port="10248"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848932 4722 flags.go:64] FLAG: --help="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848942 4722 flags.go:64] FLAG: --hostname-override=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848951 4722 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848961 4722 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848971 4722 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848981 4722 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.848991 4722 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849001 4722 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849010 4722 flags.go:64] FLAG: --image-service-endpoint=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849020 4722 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849030 4722 flags.go:64] FLAG: --kube-api-burst="100"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849040 4722 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849050 4722 flags.go:64] FLAG: --kube-api-qps="50"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849060 4722 flags.go:64] FLAG: --kube-reserved=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849069 4722 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849079 4722 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849089 4722 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849100 4722 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849109 4722 flags.go:64] FLAG: --lock-file=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849119 4722 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849129 4722 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849139 4722 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849178 4722 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849188 4722 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849198 4722 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849208 4722 flags.go:64] FLAG: --logging-format="text"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849218 4722 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849863 4722 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849876 4722 flags.go:64] FLAG: --manifest-url=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849886 4722 flags.go:64] FLAG: --manifest-url-header=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849899 4722 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849910 4722 flags.go:64] FLAG: --max-open-files="1000000"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849922 4722 flags.go:64] FLAG: --max-pods="110"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849932 4722 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849942 4722 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849952 4722 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849962 4722 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849972 4722 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849982 4722 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.849992 4722 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850012 4722 flags.go:64] FLAG: --node-status-max-images="50"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850022 4722 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850032 4722 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850042 4722 flags.go:64] FLAG: --pod-cidr=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850052 4722 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850065 4722 flags.go:64] FLAG: --pod-manifest-path=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850075 4722 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850085 4722 flags.go:64] FLAG: --pods-per-core="0"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850095 4722 flags.go:64] FLAG: --port="10250"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850106 4722 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850117 4722 flags.go:64] FLAG: --provider-id=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850127 4722 flags.go:64] FLAG: --qos-reserved=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850137 4722 flags.go:64] FLAG: --read-only-port="10255"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850147 4722 flags.go:64] FLAG: --register-node="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850183 4722 flags.go:64] FLAG: --register-schedulable="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850193 4722 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850217 4722 flags.go:64] FLAG: --registry-burst="10"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850227 4722 flags.go:64] FLAG: --registry-qps="5"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850236 4722 flags.go:64] FLAG: --reserved-cpus=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850281 4722 flags.go:64] FLAG: --reserved-memory=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850295 4722 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850305 4722 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850316 4722 flags.go:64] FLAG: --rotate-certificates="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850327 4722 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850337 4722 flags.go:64] FLAG: --runonce="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850348 4722 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850358 4722 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850369 4722 flags.go:64] FLAG: --seccomp-default="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850378 4722 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850388 4722 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850398 4722 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850409 4722 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850419 4722 flags.go:64] FLAG: --storage-driver-password="root"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850429 4722 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850439 4722 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850449 4722 flags.go:64] FLAG: --storage-driver-user="root"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850459 4722 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850470 4722 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850480 4722 flags.go:64] FLAG: --system-cgroups=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850489 4722 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850504 4722 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850515 4722 flags.go:64] FLAG: --tls-cert-file=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850525 4722 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850536 4722 flags.go:64] FLAG: --tls-min-version=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850546 4722 flags.go:64] FLAG: --tls-private-key-file=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850555 4722 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850565 4722 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850575 4722 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850585 4722 flags.go:64] FLAG: --v="2"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850597 4722 flags.go:64] FLAG: --version="false"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850610 4722 flags.go:64] FLAG: --vmodule=""
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850621 4722 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.850631 4722 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850840 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850851 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850861 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850869 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850878 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850887 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850896 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850905 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850914 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850922 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850931 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850939 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850948 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850957 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850965 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850974 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850982 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.850991 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851003 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851014 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851026 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851036 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851045 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851055 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851064 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851072 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851081 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851092 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851103 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851112 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851122 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851132 4722 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851142 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851175 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851187 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851198 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851208 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851217 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851226 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851234 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851243 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851251 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851261 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851270 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851279 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851287 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851296 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851305 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851314 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851322 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851330 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851340 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851352 4722 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851362 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851374 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851385 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851396 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851410 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851424 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851436 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851449 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851459 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851469 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851478 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851487 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851495 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851504 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851513 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851521 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851530 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.851538 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.852530 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.864869 4722 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.864921 4722 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865055 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865076 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865087 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865098 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865107 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865117 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865126 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865134 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865143 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865184 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865197 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865208 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865218 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865227 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865235 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865244 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865253 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865262 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865270 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865279 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865288 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865296 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865304 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865313 4722 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219
19:18:20.865322 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865330 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865339 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865348 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865356 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865364 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865373 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865382 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865391 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865416 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865425 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865434 4722 feature_gate.go:330] unrecognized feature gate: Example Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865443 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865454 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865465 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865474 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865483 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865495 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865508 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865518 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865528 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865538 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865548 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865558 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865567 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865576 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865584 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865593 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 
19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865602 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865610 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865619 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865628 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865636 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865645 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865654 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865662 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865671 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865679 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865694 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865706 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865717 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865727 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865736 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865745 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865753 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865762 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.865770 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.865783 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866032 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866046 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866056 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 
19:18:20.866066 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866075 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866088 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866100 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866110 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866121 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866131 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866142 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866265 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866299 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866313 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866326 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866338 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866348 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866357 4722 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866366 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866377 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866386 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866395 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866406 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866417 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866425 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866433 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866440 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866447 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866453 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866458 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866464 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866470 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866475 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866480 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866485 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866490 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866495 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866500 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866505 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866510 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866515 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 19:18:20 crc kubenswrapper[4722]: 
W0219 19:18:20.866520 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866526 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866531 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866536 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866541 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866546 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866552 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866559 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866564 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866570 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866575 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866580 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866586 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866592 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866597 4722 feature_gate.go:330] unrecognized 
feature gate: AutomatedEtcdBackup Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866602 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866607 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866612 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866619 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866625 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866630 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866634 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866640 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866645 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866650 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866655 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866660 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866665 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866670 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 
19 19:18:20 crc kubenswrapper[4722]: W0219 19:18:20.866675 4722 feature_gate.go:330] unrecognized feature gate: Example Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.866684 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.866863 4722 server.go:940] "Client rotation is on, will bootstrap in background" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.871911 4722 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.871999 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.873976 4722 server.go:997] "Starting client certificate rotation" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.874011 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.874336 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-16 17:21:27.414657228 +0000 UTC Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.874463 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.902566 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 19:18:20 crc kubenswrapper[4722]: E0219 19:18:20.903950 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.907545 4722 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.923491 4722 log.go:25] "Validated CRI v1 runtime API" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.955335 4722 log.go:25] "Validated CRI v1 image API" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.957205 4722 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.963116 4722 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-19-19-13-28-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.963167 4722 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.985224 4722 manager.go:217] Machine: {Timestamp:2026-02-19 19:18:20.981352017 +0000 UTC m=+0.593702421 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4cf2b762-873e-4422-8170-f24281d6b9fa BootID:7bf15e34-a3dc-4bfd-a83d-49c3d07d7868 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 
Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0e:2c:9c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0e:2c:9c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:65:42:da Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9a:c1:29 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:32:c5:05 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5b:72:20 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:52:62:7a:55:18:df Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:8c:b7:5e:d8:6f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.985620 4722 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.986036 4722 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.988642 4722 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.988961 4722 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.989023 4722 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.989376 4722 topology_manager.go:138] "Creating topology manager with none policy" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.989393 4722 container_manager_linux.go:303] "Creating device plugin manager" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.990112 4722 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.990167 4722 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.991383 4722 state_mem.go:36] "Initialized new in-memory state store" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.991519 4722 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.996670 4722 kubelet.go:418] "Attempting to sync node with API server" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.996704 4722 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.996743 4722 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.996766 4722 kubelet.go:324] "Adding apiserver pod source" Feb 19 19:18:20 crc kubenswrapper[4722]: I0219 19:18:20.996785 4722 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.002595 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.002725 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.002984 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.003088 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.006040 4722 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.007682 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.011222 4722 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013248 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013300 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013316 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013330 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013354 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013368 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013382 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013405 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013423 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013438 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013459 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.013474 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.014540 4722 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.015330 4722 server.go:1280] "Started kubelet" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.016089 4722 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.016248 4722 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.016640 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.017377 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.017405 4722 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.017524 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 14:47:29.764661659 +0000 UTC Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.017684 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.017782 4722 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.018015 4722 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 19 19:18:21 crc systemd[1]: Started Kubernetes Kubelet. 
Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.018690 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.017801 4722 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.020291 4722 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.018813 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.020640 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.020881 4722 factory.go:55] Registering systemd factory Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.020912 4722 factory.go:221] Registration of the systemd container factory successfully Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.021285 4722 server.go:460] "Adding debug handlers to kubelet server" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.021620 4722 factory.go:153] Registering CRI-O factory Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.021674 4722 factory.go:221] Registration of the crio container factory successfully Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 
19:18:21.021770 4722 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.021798 4722 factory.go:103] Registering Raw factory Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.021823 4722 manager.go:1196] Started watching for new ooms in manager Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.021700 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895bbf7b56cd744 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 19:18:21.015283524 +0000 UTC m=+0.627633888,LastTimestamp:2026-02-19 19:18:21.015283524 +0000 UTC m=+0.627633888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.025447 4722 manager.go:319] Starting recovery of all containers Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.043986 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.044461 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.044680 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.044868 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045692 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045755 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045787 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045819 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045855 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045891 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045920 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045949 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.045979 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046018 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046050 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046078 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046109 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046135 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046207 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046237 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046268 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046298 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046342 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046376 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046405 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046433 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046470 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046502 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046530 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046558 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046585 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046614 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046707 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046735 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046760 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046791 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046817 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046843 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046868 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046896 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046921 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046948 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.046975 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047005 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 19 19:18:21 crc 
kubenswrapper[4722]: I0219 19:18:21.047033 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047065 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047094 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047125 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047192 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047255 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047285 4722 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047312 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047348 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047379 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047410 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047440 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047468 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047517 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047545 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047572 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047598 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047626 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047651 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047678 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047707 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047743 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047771 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047797 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047825 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047853 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047879 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047905 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047933 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047967 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.047996 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048025 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048052 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048078 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048106 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048131 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048195 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048224 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048252 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048284 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048310 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048335 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048362 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048389 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048415 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048443 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048471 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048503 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048530 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048555 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048583 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048612 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048640 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048666 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048692 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048720 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048744 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048776 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048802 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048828 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048878 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" 
seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048912 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048943 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.048975 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049006 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049036 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049066 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: 
I0219 19:18:21.049094 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049126 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049196 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049232 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049260 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049289 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049315 4722 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049344 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049380 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049410 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049439 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049464 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049491 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049517 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049569 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049596 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049622 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049650 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049677 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049704 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049728 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049755 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049788 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049813 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049838 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" 
seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049867 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049894 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049923 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049951 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.049978 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050004 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050029 4722 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050053 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050082 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050108 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050135 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050204 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050232 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050263 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050289 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050324 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050350 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050377 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050402 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050431 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050457 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050485 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050602 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050631 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050658 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050685 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050713 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050739 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050767 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050802 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050829 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" 
seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050855 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050883 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050909 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050937 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050963 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.050990 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 19 19:18:21 crc 
kubenswrapper[4722]: I0219 19:18:21.051016 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.051041 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.051069 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.051108 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.051134 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.051196 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.051226 4722 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.051252 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.052816 4722 manager.go:324] Recovery completed Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054456 4722 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054522 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054549 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054574 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054597 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054618 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054637 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054658 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054678 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054697 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054720 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054749 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054778 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054807 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054834 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.054861 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" 
seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055530 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055567 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055590 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055611 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055636 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055654 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055675 4722 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055695 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055716 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055734 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055753 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055773 4722 reconstruct.go:97] "Volume reconstruction finished" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.055787 4722 reconciler.go:26] "Reconciler: start to sync state" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.064443 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 
19:18:21.065883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.065982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.066004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.066659 4722 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.068694 4722 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.068726 4722 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.068754 4722 state_mem.go:36] "Initialized new in-memory state store" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.069965 4722 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.070005 4722 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.070033 4722 kubelet.go:2335] "Starting kubelet main sync loop" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.070168 4722 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.073085 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.073189 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.082772 4722 policy_none.go:49] "None policy: Start" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.084121 4722 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.084191 4722 state_mem.go:35] "Initializing new in-memory state store" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.119380 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.137606 4722 manager.go:334] "Starting Device Plugin manager" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.137666 4722 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.137681 4722 server.go:79] "Starting device plugin registration server" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.138120 4722 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.138142 4722 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.138509 4722 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.138581 4722 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.138589 4722 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.147420 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.171303 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.171457 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.173014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.173090 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.173109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.173495 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.173598 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.173638 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.174806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.175186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.175198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.175977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.176000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.176010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.176129 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.176300 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.176342 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.177397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.177428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.177440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.178500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.178530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.178541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.178645 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.178905 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.178988 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.179243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.179272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.179284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.179409 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.179562 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.179600 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180040 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180206 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180229 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180621 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180868 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.180996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.181029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.181073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.221703 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.238450 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.239521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.239557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.239566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.239592 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.240051 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.258824 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.258875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.258901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.258934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.258975 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.258999 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259175 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259271 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259336 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259501 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: 
I0219 19:18:21.259599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259657 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.259714 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361092 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361187 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361302 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361552 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.361579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362354 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362379 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362495 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362507 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362537 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362561 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362573 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.362608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.363217 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.440526 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.441752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.441798 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.441818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.441851 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.442356 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.525022 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.535193 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.541789 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.571378 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-dccce276e710a75acd9cf3106cb4f15cdf5074f0b392f4762bba61faf631d1a3 WatchSource:0}: Error finding container dccce276e710a75acd9cf3106cb4f15cdf5074f0b392f4762bba61faf631d1a3: Status 404 returned error can't find the container with id dccce276e710a75acd9cf3106cb4f15cdf5074f0b392f4762bba61faf631d1a3 Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.573829 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-05a61350794a7859ba6fdc39b19f1eabce56c6b2ea5e0d2a491a973959f87ae0 WatchSource:0}: Error finding container 05a61350794a7859ba6fdc39b19f1eabce56c6b2ea5e0d2a491a973959f87ae0: Status 404 returned error can't find the container with id 05a61350794a7859ba6fdc39b19f1eabce56c6b2ea5e0d2a491a973959f87ae0 Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.579181 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-29d63cfec71a63cdf704739e0d3fcb40b49bf4b5b98a93efa0fe93d552075b7d WatchSource:0}: Error finding container 29d63cfec71a63cdf704739e0d3fcb40b49bf4b5b98a93efa0fe93d552075b7d: Status 404 returned error can't find the container with id 29d63cfec71a63cdf704739e0d3fcb40b49bf4b5b98a93efa0fe93d552075b7d Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.581743 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.588030 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.613863 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-bc68dddb17fb24debba9d3e5c1a5cdb2cd3330aca8b9055a35a243188af78124 WatchSource:0}: Error finding container bc68dddb17fb24debba9d3e5c1a5cdb2cd3330aca8b9055a35a243188af78124: Status 404 returned error can't find the container with id bc68dddb17fb24debba9d3e5c1a5cdb2cd3330aca8b9055a35a243188af78124 Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.622120 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms" Feb 19 19:18:21 crc kubenswrapper[4722]: W0219 19:18:21.622550 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a0d45561891464a0f6435f16cd20d5bcdc28204e20fe4e2e115661c31ea7d7f8 WatchSource:0}: Error finding container a0d45561891464a0f6435f16cd20d5bcdc28204e20fe4e2e115661c31ea7d7f8: Status 404 returned error can't find the container with id a0d45561891464a0f6435f16cd20d5bcdc28204e20fe4e2e115661c31ea7d7f8 Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.842578 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.844482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.844557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.844568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:21 crc kubenswrapper[4722]: I0219 19:18:21.844595 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 19:18:21 crc kubenswrapper[4722]: E0219 19:18:21.845107 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.017815 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 22:50:23.281279063 +0000 UTC Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.018227 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.076856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"29d63cfec71a63cdf704739e0d3fcb40b49bf4b5b98a93efa0fe93d552075b7d"} Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.077893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"05a61350794a7859ba6fdc39b19f1eabce56c6b2ea5e0d2a491a973959f87ae0"} Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.078773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dccce276e710a75acd9cf3106cb4f15cdf5074f0b392f4762bba61faf631d1a3"} Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.079658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a0d45561891464a0f6435f16cd20d5bcdc28204e20fe4e2e115661c31ea7d7f8"} Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.080445 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bc68dddb17fb24debba9d3e5c1a5cdb2cd3330aca8b9055a35a243188af78124"} Feb 19 19:18:22 crc kubenswrapper[4722]: W0219 19:18:22.157308 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:22 crc kubenswrapper[4722]: E0219 19:18:22.157422 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:22 crc kubenswrapper[4722]: W0219 19:18:22.191816 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:22 crc kubenswrapper[4722]: E0219 19:18:22.191937 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:22 crc kubenswrapper[4722]: W0219 19:18:22.337592 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:22 crc kubenswrapper[4722]: E0219 19:18:22.337964 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:22 crc kubenswrapper[4722]: E0219 19:18:22.423426 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s" Feb 19 19:18:22 crc kubenswrapper[4722]: W0219 19:18:22.523383 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:22 crc kubenswrapper[4722]: E0219 19:18:22.523687 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 
38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.645333 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.646328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.646368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.646414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:22 crc kubenswrapper[4722]: I0219 19:18:22.646438 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 19:18:22 crc kubenswrapper[4722]: E0219 19:18:22.646813 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.008091 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 19:18:23 crc kubenswrapper[4722]: E0219 19:18:23.009584 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.017757 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.018719 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 01:19:00.682892667 +0000 UTC Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.088401 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4"} Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.088463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61"} Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.088485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b"} Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.088504 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c"} Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.088611 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.090340 4722 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.090394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.090418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.092569 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d" exitCode=0 Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.092676 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d"} Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.092687 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.093794 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.093836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.093848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.094517 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9576b79b5daa6ef6792e842b3243eaaf778bb3861123e19f854445c0fccfa92a" exitCode=0 Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.094623 4722 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.094638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9576b79b5daa6ef6792e842b3243eaaf778bb3861123e19f854445c0fccfa92a"} Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.095519 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.095922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.095967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.095981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.096632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.096658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.096669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.097904 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac" exitCode=0 Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.097977 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac"}
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.098031 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.099594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.099633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.099650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.100704 4722 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8" exitCode=0
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.100740 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8"}
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.100874 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.101969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.102004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:23 crc kubenswrapper[4722]: I0219 19:18:23.102022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.018483 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.018834 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 03:42:47.643123427 +0000 UTC
Feb 19 19:18:24 crc kubenswrapper[4722]: E0219 19:18:24.024322 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.107388 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6e95d72e36f16169cc19cb62e52f3c7d289097c9b5d3373b2ad6a888847377fd" exitCode=0
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.107571 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.107728 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6e95d72e36f16169cc19cb62e52f3c7d289097c9b5d3373b2ad6a888847377fd"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.108454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.108488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.108501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.116878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.116984 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.118392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.118444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.118469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.121913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.121982 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.122012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.122220 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.123782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.123833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.123857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.130402 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.130456 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.130505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.130538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.130548 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a"}
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.138547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.138609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.138625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:24 crc kubenswrapper[4722]: W0219 19:18:24.198600 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 19 19:18:24 crc kubenswrapper[4722]: E0219 19:18:24.198671 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.247962 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.249296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.249342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.249352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:24 crc kubenswrapper[4722]: I0219 19:18:24.249381 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 19:18:24 crc kubenswrapper[4722]: E0219 19:18:24.249836 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc"
Feb 19 19:18:24 crc kubenswrapper[4722]: W0219 19:18:24.624023 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Feb 19 19:18:24 crc kubenswrapper[4722]: E0219 19:18:24.624120 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.018995 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 05:55:37.207907346 +0000 UTC
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.139521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012"}
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.139973 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.141816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.142000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.142133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.147248 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="943bb3fd1050eb6f4da7e5f6d6ec86679521bccd95137cefdc5063ba0e2c3ed8" exitCode=0
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.147349 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"943bb3fd1050eb6f4da7e5f6d6ec86679521bccd95137cefdc5063ba0e2c3ed8"}
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.147411 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.147453 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.147545 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.147463 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.149415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.149460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.149478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.149529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.149568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.149586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.150103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.150222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.150250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.161054 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.161268 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.162489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.162562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.162588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:25 crc kubenswrapper[4722]: I0219 19:18:25.373733 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.019473 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 14:19:36.679837879 +0000 UTC
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.157933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a880d3be894b1e91810637f76b23349834b9209422a39851be7dc13fe2c6b9f6"}
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.157988 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"87d761b0d41e60c04122827d2b76165abeb17e10d8ac37c4e29d52ec4525b1e9"}
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.158003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c17a8ee8e85743452631d50f129d70bdf58b3720e486c71ab77a05a51495be15"}
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.158018 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"13f97a8ea2a71958dee6fd6d349cdca004bf18aab94b1b9af91032f3f22f8925"}
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.158031 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.158091 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.159480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.159540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.159561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.326120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.326385 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.328354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.328431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.328451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:26 crc kubenswrapper[4722]: I0219 19:18:26.446695 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.019687 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 08:08:18.231686255 +0000 UTC
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.165687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc065fe769d9281fb675e614d460636fe2adead8401263e77d22e12a0f207743"}
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.165718 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.165805 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.165812 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.167502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.167546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.167564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.167512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.167703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.167728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.337135 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.450902 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.452373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.452420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.452437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.452470 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 19:18:27 crc kubenswrapper[4722]: I0219 19:18:27.785046 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.020548 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 21:09:42.201584733 +0000 UTC
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.171997 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.172035 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.172084 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.174008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.174065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.174078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.174091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.174105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.174107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.753625 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:18:28 crc kubenswrapper[4722]: I0219 19:18:28.795941 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.021458 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 20:39:28.19501822 +0000 UTC
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.174276 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.174360 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.175523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.175557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.175569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.176453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.176543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.176562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.327301 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 19:18:29 crc kubenswrapper[4722]: I0219 19:18:29.327425 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.022036 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:30:19.407838294 +0000 UTC
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.064472 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.064731 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.066140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.066247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.066281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.177225 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.178567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.178607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.178618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.580780 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.581049 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.582709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.582810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.582843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:30 crc kubenswrapper[4722]: I0219 19:18:30.587883 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:18:31 crc kubenswrapper[4722]: I0219 19:18:31.022859 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 22:20:11.157521609 +0000 UTC
Feb 19 19:18:31 crc kubenswrapper[4722]: E0219 19:18:31.147656 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 19 19:18:31 crc kubenswrapper[4722]: I0219 19:18:31.179741 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:31 crc kubenswrapper[4722]: I0219 19:18:31.179883 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:18:31 crc kubenswrapper[4722]: I0219 19:18:31.181100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:31 crc kubenswrapper[4722]: I0219 19:18:31.181177 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:31 crc kubenswrapper[4722]: I0219 19:18:31.181196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:32 crc kubenswrapper[4722]: I0219 19:18:32.023655 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 12:21:00.358385568 +0000 UTC
Feb 19 19:18:32 crc kubenswrapper[4722]: I0219 19:18:32.181701 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:32 crc kubenswrapper[4722]: I0219 19:18:32.183085 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:32 crc kubenswrapper[4722]: I0219 19:18:32.183187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:32 crc kubenswrapper[4722]: I0219 19:18:32.183217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:32 crc kubenswrapper[4722]: I0219 19:18:32.187764 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:18:33 crc kubenswrapper[4722]: I0219 19:18:33.024013 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 03:39:25.675023436 +0000 UTC
Feb 19 19:18:33 crc kubenswrapper[4722]: I0219 19:18:33.183512 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 19:18:33 crc kubenswrapper[4722]: I0219 19:18:33.184640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:33 crc kubenswrapper[4722]: I0219 19:18:33.184694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:33 crc kubenswrapper[4722]: I0219 19:18:33.184708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:34 crc kubenswrapper[4722]: I0219 19:18:34.024540 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 15:10:40.911787308 +0000 UTC
Feb 19 19:18:34 crc kubenswrapper[4722]: W0219 19:18:34.778462 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 19 19:18:34 crc kubenswrapper[4722]: I0219 19:18:34.778573 4722 trace.go:236] Trace[334005646]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 19:18:24.777) (total time: 10001ms):
Feb 19 19:18:34 crc kubenswrapper[4722]: Trace[334005646]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (19:18:34.778)
Feb 19 19:18:34 crc kubenswrapper[4722]: Trace[334005646]: [10.001038871s] [10.001038871s] END
Feb 19 19:18:34 crc kubenswrapper[4722]: E0219 19:18:34.778604 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 19 19:18:35 crc kubenswrapper[4722]: I0219 19:18:35.018573 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 19 19:18:35 crc kubenswrapper[4722]: I0219 19:18:35.024725 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 12:36:31.413463683 +0000 UTC
Feb 19 19:18:35 crc kubenswrapper[4722]: W0219 19:18:35.315579 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 19 19:18:35 crc kubenswrapper[4722]: I0219 19:18:35.315729 4722 trace.go:236] Trace[971348987]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 19:18:25.314) (total time: 10001ms):
Feb 19 19:18:35 crc kubenswrapper[4722]: Trace[971348987]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (19:18:35.315)
Feb 19 19:18:35 crc kubenswrapper[4722]: Trace[971348987]: [10.001065212s] [10.001065212s] END
Feb 19 19:18:35 crc kubenswrapper[4722]: E0219 19:18:35.315765 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 19 19:18:35 crc kubenswrapper[4722]: I0219 19:18:35.384368 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 19:18:35 crc kubenswrapper[4722]: I0219 19:18:35.384450 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 19:18:35 crc kubenswrapper[4722]: I0219 19:18:35.389066 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 19:18:35 crc kubenswrapper[4722]: I0219 19:18:35.389187 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 19:18:36 crc kubenswrapper[4722]: I0219 19:18:36.025380 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 00:07:56.126627027 +0000 UTC
Feb 19 19:18:36 crc kubenswrapper[4722]: I0219 19:18:36.456195 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]log ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]etcd ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/priority-and-fairness-filter ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-apiextensions-informers ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-apiextensions-controllers ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/crd-informer-synced ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-system-namespaces-controller ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 19 19:18:36 crc kubenswrapper[4722]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/bootstrap-controller ok
Feb 19 19:18:36 crc kubenswrapper[4722]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/start-kube-aggregator-informers ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/apiservice-registration-controller ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/apiservice-discovery-controller ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]autoregister-completion ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/apiservice-openapi-controller ok Feb 19 19:18:36 crc kubenswrapper[4722]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 19 19:18:36 crc kubenswrapper[4722]: livez check failed Feb 19 19:18:36 crc kubenswrapper[4722]: I0219 19:18:36.456261 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:18:37 crc kubenswrapper[4722]: I0219 19:18:37.026144 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 09:43:15.232094126 +0000 UTC Feb 19 19:18:38 crc kubenswrapper[4722]: I0219 19:18:38.026706 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 11:08:56.843648626 +0000 UTC Feb 19 19:18:38 crc kubenswrapper[4722]: I0219 19:18:38.830286 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-etcd/etcd-crc" Feb 19 19:18:38 crc kubenswrapper[4722]: I0219 19:18:38.830504 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:38 crc kubenswrapper[4722]: I0219 19:18:38.832430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:38 crc kubenswrapper[4722]: I0219 19:18:38.832482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:38 crc kubenswrapper[4722]: I0219 19:18:38.832494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:38 crc kubenswrapper[4722]: I0219 19:18:38.851832 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.027702 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:41:49.098759635 +0000 UTC Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.197906 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.198651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.198692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.198702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.212745 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:39 crc 
kubenswrapper[4722]: I0219 19:18:39.327017 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.327103 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:18:39 crc kubenswrapper[4722]: I0219 19:18:39.900310 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.007835 4722 apiserver.go:52] "Watching apiserver" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.014567 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.014843 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.015200 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.015216 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.015811 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.015941 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.016033 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.016187 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.016258 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.016205 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.016319 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.017440 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.018125 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.018645 4722 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.021801 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.021806 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.021924 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.021913 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.022030 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.022242 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.025422 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.028010 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 15:35:16.813907048 +0000 UTC Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.088618 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.100830 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.112282 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.126300 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.137972 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.147379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.156970 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.165415 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.363016 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.365521 4722 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.367588 4722 trace.go:236] 
Trace[2129662400]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 19:18:28.481) (total time: 11885ms): Feb 19 19:18:40 crc kubenswrapper[4722]: Trace[2129662400]: ---"Objects listed" error: 11885ms (19:18:40.367) Feb 19 19:18:40 crc kubenswrapper[4722]: Trace[2129662400]: [11.885945232s] [11.885945232s] END Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.367628 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.367747 4722 trace.go:236] Trace[1037160239]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 19:18:28.648) (total time: 11718ms): Feb 19 19:18:40 crc kubenswrapper[4722]: Trace[1037160239]: ---"Objects listed" error: 11718ms (19:18:40.367) Feb 19 19:18:40 crc kubenswrapper[4722]: Trace[1037160239]: [11.718727728s] [11.718727728s] END Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.367764 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.368798 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.380973 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.404125 4722 csr.go:261] certificate signing request csr-jw779 is approved, waiting to be issued Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.404912 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54328->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.404918 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54336->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.404972 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54328->192.168.126.11:17697: read: connection reset by peer" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.405011 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54336->192.168.126.11:17697: read: connection reset by peer" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.414790 4722 csr.go:257] certificate signing request csr-jw779 is issued Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466752 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466775 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466797 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466883 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467033 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467073 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467093 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467114 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467141 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467205 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467254 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467274 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467293 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467330 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467349 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467393 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467439 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467479 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467500 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467613 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467727 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467789 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467831 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467853 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467898 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467920 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467941 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467963 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467983 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467216 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468050 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467649 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467730 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467736 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467764 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467925 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468210 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468229 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468223 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467918 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467969 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468009 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468285 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468887 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468911 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468946 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469045 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469064 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469081 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469103 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469169 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469204 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469226 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468937 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474805 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468958 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469834 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469858 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.470046 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.470620 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.470829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471457 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471785 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.472554 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.472662 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.472792 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.473966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474091 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474679 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475143 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475292 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475354 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475383 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475586 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475610 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475632 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" 
(UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475684 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475762 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475814 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475191 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475688 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475799 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475893 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475967 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475995 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.476474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.476774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.476791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477296 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478558 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478590 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478660 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478687 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478744 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 19:18:40 crc 
kubenswrapper[4722]: I0219 19:18:40.478801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478823 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478843 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478988 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478069 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479182 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478367 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479213 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479256 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479444 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479500 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479566 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478559 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478792 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480546 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479954 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480192 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480957 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481647 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481738 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482020 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482122 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482198 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482694 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482790 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482869 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.483207 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.483908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.483755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484446 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484793 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484939 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484987 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485238 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485293 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485346 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485435 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485482 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485505 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 
19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485597 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485666 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485710 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " 
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485866 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485936 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485958 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485980 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486002 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486025 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486047 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486070 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486120 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486142 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486181 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486205 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486249 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486346 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 
19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486363 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486428 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486445 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486495 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486609 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486750 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486771 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc 
kubenswrapper[4722]: I0219 19:18:40.486846 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486864 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486945 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487003 4722 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487014 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487023 4722 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487033 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 
19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487041 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487050 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487059 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487068 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487077 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487086 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487095 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487103 4722 reconciler_common.go:293] "Volume detached for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487113 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487121 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487130 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487139 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487165 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487178 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487187 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487197 4722 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487206 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487215 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487223 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487231 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487240 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487249 4722 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487257 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487268 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487277 4722 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487288 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487297 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487306 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487315 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487324 4722 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487333 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487342 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487351 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487360 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487368 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487377 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487385 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487394 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487403 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487411 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487420 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487429 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487437 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487445 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 
19:18:40.487453 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487463 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487471 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487480 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487489 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487498 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487507 4722 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487516 4722 
reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487525 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487536 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487548 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487560 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487572 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487582 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487591 4722 reconciler_common.go:293] "Volume detached for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487600 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487609 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487621 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487633 4722 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487645 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487657 4722 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487667 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath 
\"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487652 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487676 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487731 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487746 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487765 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487777 4722 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487795 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487807 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487817 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487831 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487846 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487855 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487865 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487875 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487885 4722 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487895 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488350 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488363 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488373 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488382 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488392 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 
crc kubenswrapper[4722]: I0219 19:18:40.488401 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488410 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488432 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488442 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488452 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488462 4722 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488497 
4722 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488509 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488518 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488176 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488223 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488499 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488621 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488937 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489232 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489794 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.490177 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.490661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.490938 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491001 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491192 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491226 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491591 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491704 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491777 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491792 4722 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491938 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.492175 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.492249 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.492550 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493564 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.493842 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:40.99382177 +0000 UTC m=+20.606172104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.494265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.494437 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.494493 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:40.9944773 +0000 UTC m=+20.606827624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.494520 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.494571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.495215 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.495304 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:40.995282345 +0000 UTC m=+20.607632669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495595 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495828 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495862 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496064 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496078 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496243 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496500 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496871 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.497552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.497776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498580 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.499043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.499703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.502690 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.502865 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.502889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503215 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503506 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503498 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503607 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503810 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504393 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504422 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504440 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504508 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:41.004488493 +0000 UTC m=+20.616838907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504204 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.505234 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.505376 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.505488 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.506015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.509240 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.510767 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.510440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.510858 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509465 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.510945 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:41.010921974 +0000 UTC m=+20.623272308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.511852 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512038 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512835 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512913 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512976 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.518280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.518588 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.518694 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519283 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.529497 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.534220 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.534914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.544659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.561173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.561683 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.567676 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590946 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591040 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591123 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591207 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591245 4722 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591257 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591267 4722 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591309 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591318 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591328 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591336 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591345 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591354 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591363 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591371 4722 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591380 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591389 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591398 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591407 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591415 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591423 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591431 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591439 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591448 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591456 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591465 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591474 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591482 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591490 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591499 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591508 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591517 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591526 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591536 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591546 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591558 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591567 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591575 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591584 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591592 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591601 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591610 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591618 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591626 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591635 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591643 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591652 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591661 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591670 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591678 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc 
kubenswrapper[4722]: I0219 19:18:40.591687 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591695 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591704 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591712 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591721 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591729 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591737 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591746 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" 
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591754 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591761 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591770 4722 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591778 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591785 4722 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591793 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591801 4722 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591809 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591817 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591824 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591832 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591840 4722 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591848 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591856 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591863 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591871 4722 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591879 4722 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591886 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591894 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591902 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591909 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591917 4722 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591925 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591932 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591941 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591957 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591965 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591973 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591981 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591990 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591998 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592007 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592015 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592023 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592030 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592061 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on 
node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.637377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.650799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.661137 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.663383 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624 WatchSource:0}: Error finding container 8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624: Status 404 returned error can't find the container with id 8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624 Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.674949 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a WatchSource:0}: Error finding container 37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a: Status 404 returned error can't find the container with id 37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.797244 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xq6bx"] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.797939 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.799953 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.800450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.800739 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.801732 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.817183 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.838476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.849897 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.864221 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.873416 4722 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873534 4722 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873565 4722 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873585 4722 reflector.go:484] 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873903 4722 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.873978 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb/status\": read tcp 38.102.83.195:44978->38.102.83.195:6443: use of closed network connection" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874167 4722 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874195 4722 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874347 4722 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no 
items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874372 4722 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874389 4722 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874404 4722 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874421 4722 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.874446 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/events\": read tcp 38.102.83.195:44978->38.102.83.195:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-controller-manager-crc.1895bbf81d19a547 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 19:18:22.754661703 +0000 UTC m=+2.367012027,LastTimestamp:2026-02-19 19:18:22.754661703 +0000 UTC m=+2.367012027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874530 4722 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874549 4722 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874564 4722 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874579 4722 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very 
short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874595 4722 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.886125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.891985 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.893216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fad04006-ed10-4444-ae85-9c0a31a95466-serviceca\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " 
pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.893262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjws\" (UniqueName: \"kubernetes.io/projected/fad04006-ed10-4444-ae85-9c0a31a95466-kube-api-access-vsjws\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.893296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fad04006-ed10-4444-ae85-9c0a31a95466-host\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.993959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fad04006-ed10-4444-ae85-9c0a31a95466-serviceca\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.994101 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:41.994080399 +0000 UTC m=+21.606430723 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994125 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjws\" (UniqueName: \"kubernetes.io/projected/fad04006-ed10-4444-ae85-9c0a31a95466-kube-api-access-vsjws\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994167 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fad04006-ed10-4444-ae85-9c0a31a95466-host\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fad04006-ed10-4444-ae85-9c0a31a95466-host\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.995263 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fad04006-ed10-4444-ae85-9c0a31a95466-serviceca\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.013353 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vsjws\" (UniqueName: \"kubernetes.io/projected/fad04006-ed10-4444-ae85-9c0a31a95466-kube-api-access-vsjws\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.028132 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 16:41:26.104576767 +0000 UTC Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.079862 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.080579 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.082318 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.083133 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.084547 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.085302 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.086086 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.087952 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.088841 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.090353 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.091249 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.092902 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.093623 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094042 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094344 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094649 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094711 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094770 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.094749274 +0000 UTC m=+21.707099598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094817 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094846 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094849 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094863 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094854 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094918 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.094902979 +0000 UTC m=+21.707253313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094927 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094940 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.09493023 +0000 UTC m=+21.707280564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094945 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.095025 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.094987772 +0000 UTC m=+21.707338106 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.095664 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.096406 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.097683 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.098210 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.098993 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.100520 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.101199 4722 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.102541 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.103121 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.104548 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.104957 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.105625 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.106707 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.107189 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.108210 4722 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.108660 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.109646 4722 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.109752 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.111329 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.112218 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.112626 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.114182 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: 
I0219 19:18:41.114808 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.115728 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.116357 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.117387 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.117855 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.118519 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.118946 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.119635 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.120643 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.121146 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.122049 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.122709 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.123878 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.124393 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.125282 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.125712 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.126596 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.127181 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.127631 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.131532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.148377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.179172 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.191297 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.198849 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.205206 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.205826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.207312 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012" exitCode=255 Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.207367 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.208805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.210489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.210521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4aa51c6299b68bb6f01f25018528b68e6381f4f60652d3b2252331cb08a10c52"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.211723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xq6bx" event={"ID":"fad04006-ed10-4444-ae85-9c0a31a95466","Type":"ContainerStarted","Data":"7769573710b422fe1497d25bafd7920f3215eb3d9eb32511a2440dc0bd1c2c91"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.213878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.213908 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.213921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.216621 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.225768 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.233522 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lwpgw"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.233796 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.233841 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-w8zrl"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.234312 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.235904 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.235965 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.236249 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.236263 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.236670 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.238919 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.238997 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.239123 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.242129 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.257039 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.267115 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.278081 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.286241 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.299671 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.308021 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.315254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.329413 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.338129 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.346430 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.356620 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.365336 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.374118 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.383325 4722 scope.go:117] "RemoveContainer" containerID="e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.384113 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.396997 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9734f69-4441-4618-849c-54e0aca328e4-hosts-file\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbv9c\" (UniqueName: \"kubernetes.io/projected/e9734f69-4441-4618-849c-54e0aca328e4-kube-api-access-bbv9c\") pod 
\"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b265ff4c-d096-4b39-8032-fe0b84354832-mcd-auth-proxy-config\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkfk8\" (UniqueName: \"kubernetes.io/projected/b265ff4c-d096-4b39-8032-fe0b84354832-kube-api-access-fkfk8\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b265ff4c-d096-4b39-8032-fe0b84354832-rootfs\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b265ff4c-d096-4b39-8032-fe0b84354832-proxy-tls\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.415973 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 
2027-02-19 19:13:40 +0000 UTC, rotation deadline is 2027-01-07 13:29:13.529073629 +0000 UTC Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.416035 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7722h10m32.113040436s for next certificate rotation Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.451609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.461871 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.475377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.493200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkfk8\" (UniqueName: \"kubernetes.io/projected/b265ff4c-d096-4b39-8032-fe0b84354832-kube-api-access-fkfk8\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b265ff4c-d096-4b39-8032-fe0b84354832-rootfs\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498199 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b265ff4c-d096-4b39-8032-fe0b84354832-proxy-tls\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9734f69-4441-4618-849c-54e0aca328e4-hosts-file\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbv9c\" (UniqueName: \"kubernetes.io/projected/e9734f69-4441-4618-849c-54e0aca328e4-kube-api-access-bbv9c\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b265ff4c-d096-4b39-8032-fe0b84354832-rootfs\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9734f69-4441-4618-849c-54e0aca328e4-hosts-file\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498503 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b265ff4c-d096-4b39-8032-fe0b84354832-mcd-auth-proxy-config\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.499251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b265ff4c-d096-4b39-8032-fe0b84354832-mcd-auth-proxy-config\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.502672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b265ff4c-d096-4b39-8032-fe0b84354832-proxy-tls\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.512957 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.515248 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbv9c\" (UniqueName: 
\"kubernetes.io/projected/e9734f69-4441-4618-849c-54e0aca328e4-kube-api-access-bbv9c\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.521004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkfk8\" (UniqueName: \"kubernetes.io/projected/b265ff4c-d096-4b39-8032-fe0b84354832-kube-api-access-fkfk8\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.530595 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.548352 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.548731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.553529 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: W0219 19:18:41.568013 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9734f69_4441_4618_849c_54e0aca328e4.slice/crio-5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de WatchSource:0}: Error finding container 5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de: Status 404 returned error can't find the container with id 5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.569334 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: W0219 19:18:41.575049 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb265ff4c_d096_4b39_8032_fe0b84354832.slice/crio-39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397 WatchSource:0}: Error finding container 39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397: Status 404 returned error can't find the container with id 39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397 Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.586612 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.607981 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.608974 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7g5gg"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.609663 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612421 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612485 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612542 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612693 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612810 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.622199 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jnvgg"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.622310 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.622561 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.623740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.624660 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.644597 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.662746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.671739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.685755 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.698842 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 
19:18:41.700100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-kubelet\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75c45\" (UniqueName: \"kubernetes.io/projected/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-kube-api-access-75c45\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700187 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-socket-dir-parent\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-k8s-cni-cncf-io\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " 
pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700233 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700253 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cni-binary-copy\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-etc-kubernetes\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700290 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-system-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n829t\" (UniqueName: \"kubernetes.io/projected/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-kube-api-access-n829t\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") 
" pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-os-release\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-os-release\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-multus\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-daemon-config\") pod \"multus-jnvgg\" (UID: 
\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cnibin\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cnibin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-netns\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-multus-certs\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-hostroot\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " 
pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700584 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-system-cni-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-bin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-conf-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.709125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.712895 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.721977 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.734739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.744811 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.745924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 
19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.754242 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.765828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.776505 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.787422 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.797910 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801081 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " 
pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-bin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801130 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-conf-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-kubelet\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75c45\" (UniqueName: \"kubernetes.io/projected/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-kube-api-access-75c45\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-socket-dir-parent\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc 
kubenswrapper[4722]: I0219 19:18:41.801210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-k8s-cni-cncf-io\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801217 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-bin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801250 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-kubelet\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801299 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-conf-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cni-binary-copy\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-etc-kubernetes\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-system-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n829t\" (UniqueName: \"kubernetes.io/projected/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-kube-api-access-n829t\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801457 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-socket-dir-parent\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-os-release\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-multus\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-daemon-config\") pod 
\"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801461 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-etc-kubernetes\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cnibin\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-os-release\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-multus\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801688 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cnibin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc 
kubenswrapper[4722]: I0219 19:18:41.801745 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cnibin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-netns\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-netns\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cnibin\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-multus-certs\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801799 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-multus-certs\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-os-release\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-os-release\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-k8s-cni-cncf-io\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-hostroot\") pod \"multus-jnvgg\" (UID: 
\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-system-cni-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-hostroot\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-system-cni-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-system-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " 
pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802210 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-daemon-config\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802367 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cni-binary-copy\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.810503 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.818474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75c45\" (UniqueName: 
\"kubernetes.io/projected/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-kube-api-access-75c45\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.822397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n829t\" (UniqueName: \"kubernetes.io/projected/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-kube-api-access-n829t\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.823461 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.836056 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.849468 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.861233 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.873365 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.883565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.892796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.910363 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.939747 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.949931 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.959794 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: W0219 19:18:41.976443 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a80fcd7_8ac4_4e82_8f14_93d225898bb5.slice/crio-067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841 WatchSource:0}: Error finding container 067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841: Status 404 returned error can't find the container with id 067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841 Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.994666 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 
19:18:42.003188 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.003413 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.003379084 +0000 UTC m=+23.615729418 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.005755 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.007113 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.010135 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.011029 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.014365 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.027327 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.029920 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:48:21.448865167 +0000 UTC Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.042682 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.060300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.071029 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.071129 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.071277 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.071384 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.071456 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.071554 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104009 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104697 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104772 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod 
\"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104987 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105010 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105076 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105204 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105447 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105482 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105500 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105534 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105625 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105639 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105558 4722 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105539185 +0000 UTC m=+23.717889509 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105646 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105657549 +0000 UTC m=+23.718008083 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105704 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105773 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105740512 +0000 UTC m=+23.718090836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105860 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105844195 +0000 UTC m=+23.718194519 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.119505 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.141230 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.171244 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.180565 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206505 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206529 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206694 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206757 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod 
\"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206827 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206841 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206869 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206885 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 
19:18:42.206890 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc 
kubenswrapper[4722]: I0219 19:18:42.207066 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207517 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.210052 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.218176 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" 
event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerStarted","Data":"c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.218223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerStarted","Data":"ffb06d45fa7531890253050a0ce71077ac6c26811651a46a6310b828e8171528"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.219383 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.220113 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.221696 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.222251 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.223787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.223815 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.223826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.224816 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lwpgw" event={"ID":"e9734f69-4441-4618-849c-54e0aca328e4","Type":"ContainerStarted","Data":"d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.224840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lwpgw" event={"ID":"e9734f69-4441-4618-849c-54e0aca328e4","Type":"ContainerStarted","Data":"5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.226466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xq6bx" event={"ID":"fad04006-ed10-4444-ae85-9c0a31a95466","Type":"ContainerStarted","Data":"ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.227680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.227709 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" 
event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.228001 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.246314 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.273732 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.300697 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.322252 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.329498 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: W0219 19:18:42.335615 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb7c404_f96e_43a7_b20f_b45d856c75a5.slice/crio-80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4 WatchSource:0}: Error finding container 80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4: Status 404 returned error can't find the container with id 
80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4 Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.340737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.380337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.408254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.449304 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.480051 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.513421 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.520577 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.568627 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.606451 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.660937 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.690983 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 
19:18:42.735857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.769239 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.808294 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.847992 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.905325 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.928243 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.969685 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.021073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.030696 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 07:45:39.951751845 +0000 UTC Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.060708 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a
cfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.090905 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.136091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.173826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.208414 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.231900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.233021 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" exitCode=0 Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.233094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.233190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.234050 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330" exitCode=0 Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.234166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.247143 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cb
ccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.291432 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.328065 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.372828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.409852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.449662 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.493925 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.531175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.569044 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.608653 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.647799 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.693247 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.734320 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.769625 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.808498 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.855974 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.025268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.025446 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.02540642 +0000 UTC m=+27.637756794 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.031246 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 23:14:07.886909713 +0000 UTC Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.070973 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.071035 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.071219 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.071260 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.071003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.071870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126627 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126715 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126826 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126840 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.126815558 +0000 UTC m=+27.739165922 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126825 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126865 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126850 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126881 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126889 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126928 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.126911451 +0000 UTC m=+27.739261785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126952 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.126940712 +0000 UTC m=+27.739291046 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126964 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.127109 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-19 19:18:48.127079156 +0000 UTC m=+27.739429520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.242967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243027 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243050 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243061 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.245796 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1" exitCode=0 Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.245827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.259978 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.282432 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.295200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.311281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.324560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.337750 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.353457 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.369349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.379263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.388275 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.403085 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.420367 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.434018 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.032402 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:36:58.029039836 +0000 UTC Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.250704 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b" exitCode=0 Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.250753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b"} Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.276234 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.288188 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.303804 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.316627 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.327381 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.339537 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.350254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.365267 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 
19:18:45.378543 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.390268 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.408435 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.419906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.430425 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.033076 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:33:09.769161777 +0000 UTC Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.071076 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.071146 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.071080 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.071297 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.071433 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.071603 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.257605 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b" exitCode=0 Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.257647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b"} Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.264415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.277301 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.293235 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.310195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.325680 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.331089 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.334903 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.339862 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.339989 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.352646 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de
37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.370067 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.384842 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.401168 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.409731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.424476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.434671 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.450852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.460641 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.476983 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.492326 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.509530 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.529766 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.548740 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.561485 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.573364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.584054 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.596310 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.611051 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.627341 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.638841 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.651035 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.768965 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.770999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.771037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.771046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.771185 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.779598 4722 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.779981 4722 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786199 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.813197 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817321 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817404 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.830094 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.834955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.834998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.835012 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.835030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.835043 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.849279 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853529 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.869516 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874986 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.875011 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.890175 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.890333 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892778 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996408 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.033891 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 13:47:12.644790123 +0000 UTC Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.099814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.099920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.100022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.100130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.100258 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203479 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.271759 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009" exitCode=0 Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.271881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.290134 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306624 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.307015 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.322505 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.341309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.356408 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.373774 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.391383 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.407560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.409718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.409764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.410406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.410597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.410657 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.423928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.435944 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.448731 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.460669 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.476353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.490367 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc 
kubenswrapper[4722]: I0219 19:18:47.513322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513405 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616366 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719808 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822434 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925250 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027836 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.034133 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 15:04:38.911996376 +0000 UTC Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.066618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.066872 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.06685158 +0000 UTC m=+35.679201914 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.070615 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.070642 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.070688 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.070742 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.070870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.070989 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130288 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167753 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167849 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.167830935 +0000 UTC m=+35.780181259 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167923 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167937 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167949 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167979 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-19 19:18:56.167970549 +0000 UTC m=+35.780320873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167989 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168014 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168039 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168050 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168084 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.168052422 +0000 UTC m=+35.780402786 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168121 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.168103483 +0000 UTC m=+35.780453837 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233549 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233663 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.280752 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61" exitCode=0 Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.280833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.288003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.288451 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.288493 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.299371 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.323735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.326643 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335777 4722 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.344650 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.360801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.373584 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.394494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.406595 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.420213 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.433622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc 
kubenswrapper[4722]: I0219 19:18:48.437922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437935 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.447818 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.461647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.477657 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.487928 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.496770 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.509228 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.520068 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.529930 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.537960 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540122 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540163 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.551210 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.562192 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.571860 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.581452 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.589091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.599932 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.611146 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.629504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.640073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.656271 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.744954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.744999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.745013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.745029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.745041 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847329 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.921883 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949494 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.034857 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 02:07:25.167277077 +0000 UTC Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.111688 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155109 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257292 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.295830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerStarted","Data":"eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.296440 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.313836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.329117 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.339318 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.351462 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.365517 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.375604 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.388853 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.402601 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.415626 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.428378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.442280 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.460861 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462313 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462328 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.472514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.489886 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb
26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.505364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.521285 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.535565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.552660 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565428 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.566483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.579730 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.598055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.617399 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\
\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.631684 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.643406 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.658716 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667977 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.671382 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.694712 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.707947 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.722314 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770882 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770926 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873192 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975398 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.035455 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:18:18.264110595 +0000 UTC Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.071204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.071288 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.071294 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:50 crc kubenswrapper[4722]: E0219 19:18:50.071438 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:50 crc kubenswrapper[4722]: E0219 19:18:50.071546 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:50 crc kubenswrapper[4722]: E0219 19:18:50.071702 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078946 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.079020 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.182679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.182817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.182843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.183288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.183308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286582 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.389790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390227 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492877 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595501 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813298 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.877134 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.915949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.915979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.915988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.916013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.916036 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.018966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.036502 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 02:35:40.319276734 +0000 UTC Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.088540 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.104614 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.118225 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122963 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.132209 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.144299 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.165996 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.182379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\
\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.196628 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.208263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225710 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.232654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.243707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.260622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.276741 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.291637 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.327963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328023 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328055 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.429970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430349 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548273 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651145 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858206 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.962037 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.037347 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 00:19:31.533773245 +0000 UTC Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064471 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.070827 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:52 crc kubenswrapper[4722]: E0219 19:18:52.070912 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.070927 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.070942 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:52 crc kubenswrapper[4722]: E0219 19:18:52.070982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:52 crc kubenswrapper[4722]: E0219 19:18:52.071038 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166698 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270498 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.307996 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/0.log" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.312175 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218" exitCode=1 Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.312218 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.313041 4722 scope.go:117] "RemoveContainer" containerID="4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.337987 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.357616 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374469 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.379281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:
18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.401949 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.414525 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.428202 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.441655 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.456021 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.476736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5b
ed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480553 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480629 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.494914 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.508192 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.521018 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.546815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.571095 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending 
*v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471
827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582856 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685223 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787459 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.890893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.890967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.890992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.891022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.891042 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993946 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.037890 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:52:56.559305319 +0000 UTC Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095643 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198171 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300193 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300234 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.318376 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/0.log" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.322741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.323220 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.338897 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.353566 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.369195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.382211 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.395248 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403182 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.410014 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.421658 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.436033 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.451354 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\
\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.464379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.476901 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.489605 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.504059 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505984 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.525433 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-co
nfig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.608830 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4"] Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.609460 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.609928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610257 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.611324 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.612177 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.626223 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\
"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.654490 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-co
nfig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.667609 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.680586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.696338 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.709502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.712006 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721042 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lqm\" (UniqueName: \"kubernetes.io/projected/20b917d0-317d-4ce9-96e2-b1aa95f89663-kube-api-access-26lqm\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.724686 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.737393 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.752758 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.769831 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.785807 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.799965 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.810446 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813890 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lqm\" (UniqueName: \"kubernetes.io/projected/20b917d0-317d-4ce9-96e2-b1aa95f89663-kube-api-access-26lqm\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.823028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.823860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.830089 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.831865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.846033 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.848263 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lqm\" (UniqueName: \"kubernetes.io/projected/20b917d0-317d-4ce9-96e2-b1aa95f89663-kube-api-access-26lqm\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916659 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.929245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: W0219 19:18:53.949781 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b917d0_317d_4ce9_96e2_b1aa95f89663.slice/crio-2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd WatchSource:0}: Error finding container 2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd: Status 404 returned error can't find the container with id 2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018753 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.038311 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 14:00:41.431354651 +0000 UTC Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.070931 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.071003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.071091 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.071003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.071245 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.071393 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121448 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224642 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224670 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326849 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.329599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" event={"ID":"20b917d0-317d-4ce9-96e2-b1aa95f89663","Type":"ContainerStarted","Data":"0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.329667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" event={"ID":"20b917d0-317d-4ce9-96e2-b1aa95f89663","Type":"ContainerStarted","Data":"327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.329691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" event={"ID":"20b917d0-317d-4ce9-96e2-b1aa95f89663","Type":"ContainerStarted","Data":"2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.333085 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.334647 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/0.log" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.338929 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" exitCode=1 Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.338982 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.339048 4722 scope.go:117] "RemoveContainer" containerID="4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.340090 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.340409 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.346102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.367518 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.384601 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.402971 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.422736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429451 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.446587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.458394 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.469037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.488003 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5b
ed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.496878 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.509016 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.519117 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.526889 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531514 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.537733 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.553494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-co
nfig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.572172 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\
"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"
env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 
19:18:54.584707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.594401 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.604543 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.613885 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.621453 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.629761 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc 
kubenswrapper[4722]: I0219 19:18:54.633540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.640423 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.651203 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.661572 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.675044 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.686787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.696863 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.714678 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.730657 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736876 4722 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736903 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840200 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943768 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.039125 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 00:59:23.403723676 +0000 UTC Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046931 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046985 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.047009 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.107783 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-s6hhp"] Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.112263 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.112432 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.133691 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.151751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.169264 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.182928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.198586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.219042 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.235318 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpp4f\" 
(UniqueName: \"kubernetes.io/projected/493acad5-7300-4941-9311-19b3d5f21786-kube-api-access-gpp4f\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.235417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.235203 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253474 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.254702 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273
666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.273561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.288214 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.303727 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.326570 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.336247 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.336320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpp4f\" (UniqueName: \"kubernetes.io/projected/493acad5-7300-4941-9311-19b3d5f21786-kube-api-access-gpp4f\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.336507 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc 
kubenswrapper[4722]: E0219 19:18:55.336625 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:55.836595654 +0000 UTC m=+35.448945998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.343528 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-0
2-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.345288 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.350441 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.350700 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.357031 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.359422 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.368767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpp4f\" (UniqueName: \"kubernetes.io/projected/493acad5-7300-4941-9311-19b3d5f21786-kube-api-access-gpp4f\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.390568 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\
"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"
env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 
19:18:55.406524 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.425415 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.438501 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.455039 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.459611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459624 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.471943 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.496596 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.509582 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.526673 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.541821 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.552567 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561599 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.565101 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.575816 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.587399 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.603777 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.614108 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.626701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.638240 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.667991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668047 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668076 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770820 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.841924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.842276 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.842405 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.842368587 +0000 UTC m=+36.454718961 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874727 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977650 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.039355 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:53:36.731192633 +0000 UTC Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.071227 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.071468 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.071267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.071261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.071722 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.071605 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081873 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.145771 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.146034 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:19:12.146004293 +0000 UTC m=+51.758354647 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184848 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247681 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247785 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.247760813 +0000 UTC m=+51.860111167 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247814 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247837 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247833 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247934 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.247911778 +0000 UTC m=+51.860262122 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247835 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247976 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247856 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.248109 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.248077673 +0000 UTC m=+51.860428027 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247991 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.248199 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.248187576 +0000 UTC m=+51.860537910 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.288014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494630 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596918 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700824 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804355 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.860299 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.860493 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.860586 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:58.860563319 +0000 UTC m=+38.472913703 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906821 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906869 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906893 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009549 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.039537 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:38:40.736678785 +0000 UTC Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.071193 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.071357 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113463 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133510 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.155662 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161611 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.183529 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188910 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.204886 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212750 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.225486 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.241333 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.241557 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243430 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449239 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552850 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655846 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758717 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861779 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964408 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.039696 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:06:51.551069931 +0000 UTC Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067632 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.070912 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.070997 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.071025 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.071296 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.071428 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170414 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273730 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.479874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.479969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.479994 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.480024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.480047 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583706 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686350 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.759715 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.784748 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789494 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.800670 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.818228 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.833969 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.847815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.864996 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.879791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.880038 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.880106 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:02.880084797 +0000 UTC m=+42.492435151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.886618 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892238 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.907916 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.929579 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.950437 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.964469 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.987293 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001264 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.006550 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metri
cs-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.020132 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc 
kubenswrapper[4722]: I0219 19:18:59.036773 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.039825 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 02:43:46.344129358 +0000 UTC Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.084718 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.084672 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z"
Feb 19 19:18:59 crc kubenswrapper[4722]: E0219 19:18:59.084909 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.103768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104313 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.207870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208389 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415335 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518903 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622558 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725818 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829741 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.933334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.933673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.933898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.934079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.934361 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037997 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.040507 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 05:55:35.662232458 +0000 UTC
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.070900 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.070940 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.070900 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:00 crc kubenswrapper[4722]: E0219 19:19:00.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:00 crc kubenswrapper[4722]: E0219 19:19:00.071189 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:00 crc kubenswrapper[4722]: E0219 19:19:00.071302 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141942 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245303 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348544 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450816 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.760982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864770 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968478 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.040655 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 04:30:27.898358249 +0000 UTC
Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.070498 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:01 crc kubenswrapper[4722]: E0219 19:19:01.072723 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.077517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.077751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.077902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.078055 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.078216 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.092091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.110317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc 
kubenswrapper[4722]: I0219 19:19:01.131497 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.149558 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.168271 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181621 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181684 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.185063 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.204252 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.232517 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.246521 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.270803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284621 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284638 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.295044 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0
dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.314316 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.328097 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.342881 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.355338 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.374037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.387788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc 
kubenswrapper[4722]: I0219 19:19:01.388186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.388378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.388679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.389055 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492588 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595092 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698383 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800312 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800336 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903870 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006430 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.041552 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 12:50:43.977918212 +0000 UTC Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.070986 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.071090 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.071180 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.071366 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.071518 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.071693 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.109587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.109916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.110051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.110358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.110513 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.316575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.316930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.317067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.317294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.317430 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420633 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.524073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525668 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629453 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732798 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835560 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835614 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.924744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.924921 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.925600 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:10.925564294 +0000 UTC m=+50.537914658 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939364 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.041735 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 19:13:16.842382981 +0000 UTC Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.041968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042047 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042218 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.070722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:03 crc kubenswrapper[4722]: E0219 19:19:03.070954 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145142 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145193 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247405 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350514 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.453467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.453858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.454041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.454401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.454709 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557706 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.660534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.660829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.661050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.661258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.661436 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.764680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765932 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869685 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.972510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.972940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.973330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.973687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.974066 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.042524 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 20:04:50.819418203 +0000 UTC Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.070848 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:04 crc kubenswrapper[4722]: E0219 19:19:04.071281 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.070999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:04 crc kubenswrapper[4722]: E0219 19:19:04.071640 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.070953 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:04 crc kubenswrapper[4722]: E0219 19:19:04.071861 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.078016 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.181830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.182245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.182503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.182767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.183004 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.285992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286980 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.391090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494645 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598270 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701530 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.804398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805346 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.908972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909084 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012954 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.042679 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 14:42:39.312512215 +0000 UTC Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.070795 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:05 crc kubenswrapper[4722]: E0219 19:19:05.070990 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117270 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220860 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.323860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.530610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.530961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.531059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.531178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.531276 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634084 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737295 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840782 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944515 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.043513 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 03:28:33.136304715 +0000 UTC Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.071091 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.071184 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:06 crc kubenswrapper[4722]: E0219 19:19:06.071398 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:06 crc kubenswrapper[4722]: E0219 19:19:06.071566 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.071870 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:06 crc kubenswrapper[4722]: E0219 19:19:06.071982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150670 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254382 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.356967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357053 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.459988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460075 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563949 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668407 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770784 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873825 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.978042 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.044478 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 10:47:41.085538751 +0000 UTC Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.071084 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.071288 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.083925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084093 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.186988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187287 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.291021 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498539 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602275 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.618000 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.640035 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646742 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.663859 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668331 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.688820 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693702 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.713113 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.717623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.717783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.717886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.718038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.718174 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.741721 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.742280 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744959 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.848658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.848750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.848767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.849475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.849502 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.953957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954123 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.044617 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 11:27:15.451662328 +0000 UTC Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057334 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.070624 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.070696 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.070696 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:08 crc kubenswrapper[4722]: E0219 19:19:08.070788 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:08 crc kubenswrapper[4722]: E0219 19:19:08.071186 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:08 crc kubenswrapper[4722]: E0219 19:19:08.071051 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.161091 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.265297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.265956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.266245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.266431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.266642 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369948 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473897 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577482 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679393 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679441 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782409 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885616 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.045566 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 22:32:25.168409693 +0000 UTC Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.071100 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:09 crc kubenswrapper[4722]: E0219 19:19:09.071336 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091491 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195295 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298856 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401367 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401486 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504853 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608280 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711627 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815315 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919431 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.023076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024576 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.046494 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 02:03:14.502098327 +0000 UTC Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.069635 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.070571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.070682 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.070571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:10 crc kubenswrapper[4722]: E0219 19:19:10.070743 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:10 crc kubenswrapper[4722]: E0219 19:19:10.070853 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:10 crc kubenswrapper[4722]: E0219 19:19:10.070945 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.072316 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.086867 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.089743 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.107881 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0
a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.120429 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc 
kubenswrapper[4722]: I0219 19:19:10.127594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127607 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.136822 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.153489 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.168702 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.186856 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.210564 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.227580 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232944 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.250973 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.266575 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.284232 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc 
kubenswrapper[4722]: I0219 19:19:10.311380 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.327563 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338318 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.342283 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.354573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.408831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.413477 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.415545 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.432868 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e
4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\"
:\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.440932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.440983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.441001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.441023 
4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.441041 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.448678 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.462734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.472851 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.485654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.499512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.511875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8
410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.526263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc 
kubenswrapper[4722]: I0219 19:19:10.543435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543518 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.544860 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78d
f8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.561317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.577638 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.594176 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.608961 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.620539 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.639410 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645675 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.650891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.662809 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748771 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748917 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.851704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852511 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.021115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.021278 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.021691 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:27.021670648 +0000 UTC m=+66.634020982 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.047126 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:15:53.9038052 +0000 UTC Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058423 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.071111 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.071426 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.090072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.105730 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.121051 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.133207 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.147625 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc 
kubenswrapper[4722]: I0219 19:19:11.159984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159998 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.165057 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.183668 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.199296 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.214323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.228027 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.242095 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263347 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.264751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.283815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.297837 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc 
kubenswrapper[4722]: I0219 19:19:11.316941 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.348990 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] 
Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367224 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367358 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.423996 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.424964 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.427998 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" exitCode=1 Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.428049 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.428095 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.428925 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.429125 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 
20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.446231 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.458846 4722 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469656 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469784 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.474769 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.489485 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.500349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.508894 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.517178 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.531826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.544186 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.553493 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc 
kubenswrapper[4722]: I0219 19:19:11.564226 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.574387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575328 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.586559 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.600750 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.622522 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bi
n\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.642320 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.655027 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678430 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678452 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781347 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883573 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986230 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.047582 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 23:28:33.964837636 +0000 UTC Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.071446 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.071626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.072235 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.072349 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.072385 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.072511 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089981 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192965 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.232099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.232344 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:19:44.232305372 +0000 UTC m=+83.844655706 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295121 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333791 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333813 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333826 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333882 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.333866956 +0000 UTC m=+83.946217290 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333873 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333998 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334116 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334252 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334008 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.333974549 +0000 UTC m=+83.946324903 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333878 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334476 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.334371672 +0000 UTC m=+83.946722036 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334549 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.334530967 +0000 UTC m=+83.946881421 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.397828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398342 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.433781 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.437126 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.437410 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.457532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.484851 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.497482 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc 
kubenswrapper[4722]: I0219 19:19:12.501721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501738 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.510488 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.520479 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.532553 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.548299 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.563830 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.575530 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.586736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.601222 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606386 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.610940 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.622604 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb
26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.634656 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.644302 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc 
kubenswrapper[4722]: I0219 19:19:12.656072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.669171 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708887 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916829 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.019957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020137 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.048310 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:39:11.305739652 +0000 UTC Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.070982 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:13 crc kubenswrapper[4722]: E0219 19:19:13.071233 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123349 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225888 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328126 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431350 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431380 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534229 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636851 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739797 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.945956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946051 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048427 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 18:29:40.759575137 +0000 UTC Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048821 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.070661 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:14 crc kubenswrapper[4722]: E0219 19:19:14.070799 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.070657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.070661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:14 crc kubenswrapper[4722]: E0219 19:19:14.070893 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:14 crc kubenswrapper[4722]: E0219 19:19:14.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152382 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256707 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360193 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360267 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462964 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565849 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667884 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770535 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872756 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974889 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.049804 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 15:57:14.419190545 +0000 UTC Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.070622 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:15 crc kubenswrapper[4722]: E0219 19:19:15.070814 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.076971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077078 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180212 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282910 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386210 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488772 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591745 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694410 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694432 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797919 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900450 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002930 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.050831 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 05:17:01.758381173 +0000 UTC Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.070437 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.070461 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:16 crc kubenswrapper[4722]: E0219 19:19:16.070580 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.070621 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:16 crc kubenswrapper[4722]: E0219 19:19:16.070674 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:16 crc kubenswrapper[4722]: E0219 19:19:16.070810 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106218 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208775 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312285 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518319 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621748 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724958 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830224 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933682 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037347 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.051487 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 23:40:33.946486748 +0000 UTC Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.070556 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:17 crc kubenswrapper[4722]: E0219 19:19:17.070733 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140775 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244508 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348396 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451794 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554807 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657998 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760136 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760151 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760179 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862608 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965410 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.051678 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 15:53:46.342556142 +0000 UTC Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068313 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068348 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.070495 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.070628 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.070789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.070840 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.070937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.070988 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.155415 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159709 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.171250 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174921 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190900 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.219969 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.220184 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.221960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.221987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.221998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.222012 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.222022 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324929 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427377 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427439 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427501 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531324 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531384 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.635022 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.738707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739585 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841888 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944845 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047283 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.052591 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 18:44:00.412904423 +0000 UTC Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.070924 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:19 crc kubenswrapper[4722]: E0219 19:19:19.071103 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150435 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252944 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356568 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459532 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563742 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666701 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769558 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872971 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977521 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.053520 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:41:03.878988016 +0000 UTC Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.070346 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:20 crc kubenswrapper[4722]: E0219 19:19:20.070718 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.070822 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.070812 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:20 crc kubenswrapper[4722]: E0219 19:19:20.070969 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:20 crc kubenswrapper[4722]: E0219 19:19:20.071183 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183869 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183885 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287381 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390367 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390448 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.493928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.493981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.494003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.494033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.494055 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596365 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596393 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802680 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010219 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.054088 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:18:29.741901914 +0000 UTC Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.070678 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:21 crc kubenswrapper[4722]: E0219 19:19:21.070877 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.095779 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\
\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.113124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.113488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.114016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.114526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.114941 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.113894 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.130872 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.145580 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.160159 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.180194 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.190980 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc 
kubenswrapper[4722]: I0219 19:19:21.203904 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.220082 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.234751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.246836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.267008 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.280838 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.296968 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.318135 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322216 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322260 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.337817 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.353257 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.424483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.424808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.424932 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.425048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.425138 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528206 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631560 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631619 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734883 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837620 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837800 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941278 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044210 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.054581 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 04:05:54.162692732 +0000 UTC Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.071101 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.071204 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:22 crc kubenswrapper[4722]: E0219 19:19:22.071224 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:22 crc kubenswrapper[4722]: E0219 19:19:22.071374 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.071767 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:22 crc kubenswrapper[4722]: E0219 19:19:22.072093 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.147015 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249385 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351369 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454565 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557909 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557955 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661600 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764712 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867676 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970873 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.055191 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 20:36:53.755172638 +0000 UTC Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.071434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:23 crc kubenswrapper[4722]: E0219 19:19:23.071650 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073544 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176898 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.279677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280525 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384826 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.488252 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.590951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.591444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.591591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.591766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.592672 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.696289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.696678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.696864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.697046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.697467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.800969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801892 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.905226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.905938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.906049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.906180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.906294 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.009600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.010309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.015588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.015836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.016054 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.055938 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 08:04:12.153452002 +0000 UTC Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.070679 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:24 crc kubenswrapper[4722]: E0219 19:19:24.071248 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.071052 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:24 crc kubenswrapper[4722]: E0219 19:19:24.072473 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.071010 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:24 crc kubenswrapper[4722]: E0219 19:19:24.073283 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.119992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120619 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.224323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.224723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.224887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.225060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.225269 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328286 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430908 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533954 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.635924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.635975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.635989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.636007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.636021 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738687 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841912 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841993 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944635 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047330 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.056397 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 09:19:33.135184471 +0000 UTC Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.070799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:25 crc kubenswrapper[4722]: E0219 19:19:25.070923 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149464 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149539 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252147 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252239 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.356742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357321 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357377 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357415 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463626 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566590 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669423 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669466 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772220 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874591 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977341 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.056520 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 16:45:29.938180411 +0000 UTC Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.070887 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.071002 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.071157 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.071217 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.071505 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.071679 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.071900 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.072069 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079545 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182715 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295834 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398102 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499925 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602325 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704753 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.807935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.807981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.807998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.808022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.808040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911443 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017855 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.057395 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 05:23:48.335135314 +0000 UTC Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.071289 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:27 crc kubenswrapper[4722]: E0219 19:19:27.071481 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.097020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:27 crc kubenswrapper[4722]: E0219 19:19:27.097235 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:27 crc kubenswrapper[4722]: E0219 19:19:27.097321 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:59.09729572 +0000 UTC m=+98.709646074 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120573 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224213 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326415 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429142 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429213 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633680 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735606 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837896 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941696 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044960 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.057927 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 23:02:33.407806425 +0000 UTC Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.070279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.070304 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.070278 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.070412 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.070527 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.070622 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147357 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249961 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352172 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454930 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557428 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621301 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.633676 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637146 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.648686 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651536 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.692954 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694452 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796780 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.898892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.898972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.898992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.899019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.899038 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001359 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.058344 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 10:37:06.326519749 +0000 UTC Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.071182 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:29 crc kubenswrapper[4722]: E0219 19:19:29.071423 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103579 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206839 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309945 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413126 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505094 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/0.log" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505212 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" containerID="5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877" exitCode=1 Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerDied","Data":"5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505816 4722 scope.go:117] "RemoveContainer" containerID="5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516627 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.518190 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 
19:19:29.534667 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc 
kubenswrapper[4722]: I0219 19:19:29.547526 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.562098 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.579326 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.591699 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.611915 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619516 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.629109 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.657205 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.671782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.683217 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.694929 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c0
6f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.707746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.718132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721485 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.728197 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.737753 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.748402 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823962 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027671 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.058898 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 20:30:00.302078097 +0000 UTC Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.070841 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.070878 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.070937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:30 crc kubenswrapper[4722]: E0219 19:19:30.071083 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:30 crc kubenswrapper[4722]: E0219 19:19:30.071212 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:30 crc kubenswrapper[4722]: E0219 19:19:30.071372 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129495 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.334134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.334669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.334873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.335033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.335238 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.438990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439093 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.510967 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/0.log" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.511418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.528707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 
19:19:30.541476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541521 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.549629 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.567089 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.583040 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.595544 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.608032 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.620666 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.634697 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.643983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644075 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.647049 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.658209 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.673701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.686565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.700221 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc 
kubenswrapper[4722]: I0219 19:19:30.714448 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.737465 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747231 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.751379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.769495 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849705 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951673 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053932 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.059008 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:54:21.377737464 +0000 UTC Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.070967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:31 crc kubenswrapper[4722]: E0219 19:19:31.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.086912 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\
"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.100233 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.112394 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.123762 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.137677 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.149355 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156083 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.163234 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.175114 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.186110 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.199561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.212697 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.223380 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc 
kubenswrapper[4722]: I0219 19:19:31.235918 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.248734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258656 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.273874 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.285299 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.296791 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361177 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361245 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464359 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567410 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567418 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772233 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874965 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977235 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.059198 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 07:54:46.073560715 +0000 UTC Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.070596 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.070650 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.070710 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:32 crc kubenswrapper[4722]: E0219 19:19:32.070735 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:32 crc kubenswrapper[4722]: E0219 19:19:32.070802 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:32 crc kubenswrapper[4722]: E0219 19:19:32.070873 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182136 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182193 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284667 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387762 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593393 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696711 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696720 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.901964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902075 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004139 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004290 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.059573 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:15:55.834092976 +0000 UTC Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.071045 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:33 crc kubenswrapper[4722]: E0219 19:19:33.071218 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107122 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209515 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414240 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516726 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618536 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720540 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823415 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.060139 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 20:45:04.816004198 +0000 UTC Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.070802 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.070847 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.070903 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:34 crc kubenswrapper[4722]: E0219 19:19:34.071007 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:34 crc kubenswrapper[4722]: E0219 19:19:34.071180 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:34 crc kubenswrapper[4722]: E0219 19:19:34.071254 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130778 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234218 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336994 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439578 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439594 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541913 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.643915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644130 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749609 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852264 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955402 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.057945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.057991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.058003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.058022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.058035 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.061144 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:06:52.48138202 +0000 UTC Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.070680 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:35 crc kubenswrapper[4722]: E0219 19:19:35.070849 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160233 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262965 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365886 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469089 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572837 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676264 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.779899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780197 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883657 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883698 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986704 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.062272 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:53:17.349207934 +0000 UTC Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.070812 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.070827 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.070833 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:36 crc kubenswrapper[4722]: E0219 19:19:36.071239 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:36 crc kubenswrapper[4722]: E0219 19:19:36.071034 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:36 crc kubenswrapper[4722]: E0219 19:19:36.071369 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088871 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088914 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191819 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294730 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401558 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504779 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504827 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607598 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710411 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917226 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020232 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.062630 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:07:21.478431855 +0000 UTC Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.071550 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:37 crc kubenswrapper[4722]: E0219 19:19:37.071761 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.084031 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122952 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226431 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329647 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432439 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432477 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534967 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.637972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638038 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741504 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844577 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948651 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051752 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.063190 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:04:01.257382823 +0000 UTC Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.070548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.070602 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.070555 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.070744 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.070839 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.070957 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155374 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258898 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362447 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362498 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.465713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.465790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.466019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.466049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.466072 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569276 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672777 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775985 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.858014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.881121 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891534 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.908367 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912746 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.934061 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939801 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.959890 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965273 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.986404 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.986681 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989330 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.064184 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 23:34:30.023516193 +0000 UTC Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.071082 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:39 crc kubenswrapper[4722]: E0219 19:19:39.071888 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.072364 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092291 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.196701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197423 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197505 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301420 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404846 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507860 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.540481 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.543008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.543561 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.565650 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\
\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-s
cript-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.578494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.591849 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.603796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610649 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.617492 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.631317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.643795 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.657411 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.675512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.698335 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712739 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.716378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273
666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.729269 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.743860 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.753586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.764703 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.774249 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.785398 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc 
kubenswrapper[4722]: I0219 19:19:39.798548 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815794 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815816 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918675 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.021965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.064853 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 15:34:30.186356001 +0000 UTC Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.070531 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.070602 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.070610 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.070754 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.070912 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.071132 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125999 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228744 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331677 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.550797 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.551902 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.555767 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" exitCode=1 Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.555835 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.555922 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.557077 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.557433 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.583538 4722 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.602237 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.620141 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.642048 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644549 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644667 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.655976 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.674339 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.689253 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-v
ar-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.706217 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c0
6f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.720696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.731388 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.739932 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747292 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.752855 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.762455 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.773592 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc 
kubenswrapper[4722]: I0219 19:19:40.782324 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.794075 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.822182 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:39Z\\\",\\\"message\\\":\\\"io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007681e6b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: 
machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.16],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 19:19:39.913543 6799 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neig\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-
bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.840057 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cba
a1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.849943 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.849979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.850008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.850026 4722 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.850040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952772 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055713 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.066027 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 14:49:59.929348512 +0000 UTC Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.070578 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:41 crc kubenswrapper[4722]: E0219 19:19:41.070698 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.086245 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls
\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.107456 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.144323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159613 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.172890 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273
666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.193204 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.205457 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.217225 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.233828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.243066 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.254179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc 
kubenswrapper[4722]: I0219 19:19:41.261915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.261961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.261972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.261990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.262002 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.269377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78d
f8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.285235 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.295557 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.304370 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.320696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.348416 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:39Z\\\",\\\"message\\\":\\\"io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007681e6b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: 
machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.16],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 19:19:39.913543 6799 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neig\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-
bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.361172 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363539 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.374175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465794 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.560064 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.562728 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:19:41 crc kubenswrapper[4722]: E0219 19:19:41.562848 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567301 4722 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.579676 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.610700 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:39Z\\\",\\\"message\\\":\\\"io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007681e6b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.16],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 19:19:39.913543 6799 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neig\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.622315 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.635052 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.651484 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.664620 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.681367 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.694810 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.705048 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.715904 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.728392 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.742275 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.758410 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.771963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772040 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 
19:19:41.772061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772022 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772076 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.785207 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.798327 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.812009 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc 
kubenswrapper[4722]: I0219 19:19:41.822731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874293 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977224 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.066767 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:14:43.971706931 +0000 UTC Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.071196 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.071245 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.071206 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:42 crc kubenswrapper[4722]: E0219 19:19:42.071357 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:42 crc kubenswrapper[4722]: E0219 19:19:42.071527 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:42 crc kubenswrapper[4722]: E0219 19:19:42.071688 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080893 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185410 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185433 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288977 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391139 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494285 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596915 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700202 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803903 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906621 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.008974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009085 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009106 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.067598 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 04:38:44.365254537 +0000 UTC Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.071372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:43 crc kubenswrapper[4722]: E0219 19:19:43.071620 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111994 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214526 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.317859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.317960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.317979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.318002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.318016 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427886 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531565 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634897 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738317 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841642 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841691 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944097 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046882 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.067717 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:13:23.722768903 +0000 UTC Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.071108 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.071204 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.071111 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.071314 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.071425 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.071552 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150462 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.275705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.275895 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:48.275854089 +0000 UTC m=+147.888204453 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357932 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377979 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378040 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378070 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378123 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378124 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378145 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378079 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378192 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378214 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc 
kubenswrapper[4722]: E0219 19:19:44.378194 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.378140817 +0000 UTC m=+147.990491141 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378259 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.3782351 +0000 UTC m=+147.990585464 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378284 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.378271211 +0000 UTC m=+147.990621575 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378306 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.378294701 +0000 UTC m=+147.990645065 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461945 4722 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.564982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565115 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667262 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770317 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872418 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.975911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.975979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.976052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.976077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.976095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.068606 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:28:54.133234058 +0000 UTC Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.071305 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:45 crc kubenswrapper[4722]: E0219 19:19:45.071496 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078741 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182481 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286264 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389478 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492282 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595603 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698667 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802221 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905830 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.008995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009070 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.069615 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 09:55:36.57119654 +0000 UTC Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.070938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.071051 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.070938 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:46 crc kubenswrapper[4722]: E0219 19:19:46.071214 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:46 crc kubenswrapper[4722]: E0219 19:19:46.071292 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:46 crc kubenswrapper[4722]: E0219 19:19:46.071380 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.117883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.117964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.117989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.118022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.118045 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220716 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220857 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324553 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324629 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426595 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529547 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632714 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736389 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839175 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942195 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044344 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.069823 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:34:20.00876621 +0000 UTC Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.072316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:47 crc kubenswrapper[4722]: E0219 19:19:47.072493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250322 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353981 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456990 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560678 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.663911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.663991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.664016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.664049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.664072 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766804 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869179 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972194 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070135 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:05:39.528372624 +0000 UTC Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070317 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070388 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070326 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:48 crc kubenswrapper[4722]: E0219 19:19:48.070462 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:48 crc kubenswrapper[4722]: E0219 19:19:48.070617 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:48 crc kubenswrapper[4722]: E0219 19:19:48.070678 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.073998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074065 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177551 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280783 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383703 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590327 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694972 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798595 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903271 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006729 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.070973 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 17:14:34.340507253 +0000 UTC Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.071270 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:49 crc kubenswrapper[4722]: E0219 19:19:49.071612 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110574 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.206875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.206960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.206987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.207129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.207193 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240353 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.275555 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s"] Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.276097 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280263 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280533 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280610 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280553 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338485 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338708 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338759 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.340741 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=63.340710849 podStartE2EDuration="1m3.340710849s" podCreationTimestamp="2026-02-19 19:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.340132991 +0000 UTC m=+88.952483355" watchObservedRunningTime="2026-02-19 19:19:49.340710849 +0000 UTC m=+88.953061203" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.341078 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.34106602 podStartE2EDuration="1m8.34106602s" 
podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.314205776 +0000 UTC m=+88.926556110" watchObservedRunningTime="2026-02-19 19:19:49.34106602 +0000 UTC m=+88.953416384" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.370241 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xq6bx" podStartSLOduration=69.370215236 podStartE2EDuration="1m9.370215236s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.369861395 +0000 UTC m=+88.982211759" watchObservedRunningTime="2026-02-19 19:19:49.370215236 +0000 UTC m=+88.982565580" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.382705 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podStartSLOduration=69.382679162 podStartE2EDuration="1m9.382679162s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.382261889 +0000 UTC m=+88.994612243" watchObservedRunningTime="2026-02-19 19:19:49.382679162 +0000 UTC m=+88.995029486" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.395902 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jnvgg" podStartSLOduration=69.395887492 podStartE2EDuration="1m9.395887492s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.395728917 +0000 UTC m=+89.008079251" watchObservedRunningTime="2026-02-19 
19:19:49.395887492 +0000 UTC m=+89.008237816" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.426394 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.426378419 podStartE2EDuration="39.426378419s" podCreationTimestamp="2026-02-19 19:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.425902575 +0000 UTC m=+89.038252949" watchObservedRunningTime="2026-02-19 19:19:49.426378419 +0000 UTC m=+89.038728743" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440652 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440702 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.442171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: 
\"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.451941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.461280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.493809 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" podStartSLOduration=69.493791533 podStartE2EDuration="1m9.493791533s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.492565286 +0000 UTC m=+89.104915610" watchObservedRunningTime="2026-02-19 19:19:49.493791533 +0000 UTC m=+89.106141857" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.494066 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lwpgw" podStartSLOduration=69.494060052 podStartE2EDuration="1m9.494060052s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.471705547 
+0000 UTC m=+89.084055871" watchObservedRunningTime="2026-02-19 19:19:49.494060052 +0000 UTC m=+89.106410376" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.518519 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" podStartSLOduration=68.518498951 podStartE2EDuration="1m8.518498951s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.505989492 +0000 UTC m=+89.118339836" watchObservedRunningTime="2026-02-19 19:19:49.518498951 +0000 UTC m=+89.130849285" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.539047 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=12.539027968 podStartE2EDuration="12.539027968s" podCreationTimestamp="2026-02-19 19:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.537800851 +0000 UTC m=+89.150151195" watchObservedRunningTime="2026-02-19 19:19:49.539027968 +0000 UTC m=+89.151378302" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.595015 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: W0219 19:19:49.610977 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e0afb8_cdad_45eb_a49f_e9f6ca11ec1b.slice/crio-5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064 WatchSource:0}: Error finding container 5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064: Status 404 returned error can't find the container with id 5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064 Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.070679 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.070699 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.070702 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.071324 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 19:52:55.760042728 +0000 UTC Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.071697 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 19 19:19:50 crc kubenswrapper[4722]: E0219 19:19:50.073014 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:50 crc kubenswrapper[4722]: E0219 19:19:50.073405 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:50 crc kubenswrapper[4722]: E0219 19:19:50.073687 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.082107 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.590795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" event={"ID":"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b","Type":"ContainerStarted","Data":"57bb636fc11755ab566eb5dcb02b3dc00443892c8a61fd8931403141cf1eb485"} Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.590852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" event={"ID":"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b","Type":"ContainerStarted","Data":"5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064"} Feb 19 
19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.605046 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" podStartSLOduration=70.60502861 podStartE2EDuration="1m10.60502861s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:50.604401761 +0000 UTC m=+90.216752095" watchObservedRunningTime="2026-02-19 19:19:50.60502861 +0000 UTC m=+90.217378934" Feb 19 19:19:51 crc kubenswrapper[4722]: I0219 19:19:51.070646 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:51 crc kubenswrapper[4722]: E0219 19:19:51.071684 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:52 crc kubenswrapper[4722]: I0219 19:19:52.071044 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:52 crc kubenswrapper[4722]: I0219 19:19:52.071078 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:52 crc kubenswrapper[4722]: E0219 19:19:52.071205 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:52 crc kubenswrapper[4722]: I0219 19:19:52.071221 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:52 crc kubenswrapper[4722]: E0219 19:19:52.071293 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:52 crc kubenswrapper[4722]: E0219 19:19:52.071380 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:53 crc kubenswrapper[4722]: I0219 19:19:53.071095 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:53 crc kubenswrapper[4722]: E0219 19:19:53.071572 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:54 crc kubenswrapper[4722]: I0219 19:19:54.070734 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:54 crc kubenswrapper[4722]: I0219 19:19:54.070817 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:54 crc kubenswrapper[4722]: I0219 19:19:54.070743 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:54 crc kubenswrapper[4722]: E0219 19:19:54.070911 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:54 crc kubenswrapper[4722]: E0219 19:19:54.071005 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:54 crc kubenswrapper[4722]: E0219 19:19:54.071141 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:55 crc kubenswrapper[4722]: I0219 19:19:55.071128 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:55 crc kubenswrapper[4722]: E0219 19:19:55.071886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:55 crc kubenswrapper[4722]: I0219 19:19:55.091631 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.071138 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.071288 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.071340 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.071501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.071533 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.072052 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.072536 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.072759 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:57 crc kubenswrapper[4722]: I0219 19:19:57.070403 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:57 crc kubenswrapper[4722]: E0219 19:19:57.070622 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:58 crc kubenswrapper[4722]: I0219 19:19:58.071232 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:58 crc kubenswrapper[4722]: I0219 19:19:58.071265 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:58 crc kubenswrapper[4722]: I0219 19:19:58.071340 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:58 crc kubenswrapper[4722]: E0219 19:19:58.072432 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:58 crc kubenswrapper[4722]: E0219 19:19:58.072698 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:58 crc kubenswrapper[4722]: E0219 19:19:58.072758 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:59 crc kubenswrapper[4722]: I0219 19:19:59.070940 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:59 crc kubenswrapper[4722]: E0219 19:19:59.071137 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:59 crc kubenswrapper[4722]: I0219 19:19:59.151581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:59 crc kubenswrapper[4722]: E0219 19:19:59.151765 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:59 crc kubenswrapper[4722]: E0219 19:19:59.152362 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:21:03.152325496 +0000 UTC m=+162.764675850 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:20:00 crc kubenswrapper[4722]: I0219 19:20:00.070685 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:00 crc kubenswrapper[4722]: I0219 19:20:00.070746 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:00 crc kubenswrapper[4722]: I0219 19:20:00.070748 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:00 crc kubenswrapper[4722]: E0219 19:20:00.070900 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:00 crc kubenswrapper[4722]: E0219 19:20:00.071063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:00 crc kubenswrapper[4722]: E0219 19:20:00.071388 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:01 crc kubenswrapper[4722]: I0219 19:20:01.072448 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:01 crc kubenswrapper[4722]: E0219 19:20:01.072578 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:01 crc kubenswrapper[4722]: I0219 19:20:01.108801 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.108772576 podStartE2EDuration="6.108772576s" podCreationTimestamp="2026-02-19 19:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:01.105958869 +0000 UTC m=+100.718309243" watchObservedRunningTime="2026-02-19 19:20:01.108772576 +0000 UTC m=+100.721122940" Feb 19 19:20:02 crc kubenswrapper[4722]: I0219 19:20:02.071120 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:02 crc kubenswrapper[4722]: I0219 19:20:02.071255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:02 crc kubenswrapper[4722]: I0219 19:20:02.071181 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:02 crc kubenswrapper[4722]: E0219 19:20:02.071358 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:02 crc kubenswrapper[4722]: E0219 19:20:02.071493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:02 crc kubenswrapper[4722]: E0219 19:20:02.071660 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:03 crc kubenswrapper[4722]: I0219 19:20:03.071115 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:03 crc kubenswrapper[4722]: E0219 19:20:03.071397 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:04 crc kubenswrapper[4722]: I0219 19:20:04.070612 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:04 crc kubenswrapper[4722]: I0219 19:20:04.070657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:04 crc kubenswrapper[4722]: I0219 19:20:04.070612 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:04 crc kubenswrapper[4722]: E0219 19:20:04.070808 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:04 crc kubenswrapper[4722]: E0219 19:20:04.070994 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:04 crc kubenswrapper[4722]: E0219 19:20:04.071073 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:05 crc kubenswrapper[4722]: I0219 19:20:05.070565 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:05 crc kubenswrapper[4722]: E0219 19:20:05.070712 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:06 crc kubenswrapper[4722]: I0219 19:20:06.071144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:06 crc kubenswrapper[4722]: I0219 19:20:06.071200 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:06 crc kubenswrapper[4722]: I0219 19:20:06.071277 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:06 crc kubenswrapper[4722]: E0219 19:20:06.071463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:06 crc kubenswrapper[4722]: E0219 19:20:06.071595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:06 crc kubenswrapper[4722]: E0219 19:20:06.071730 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:07 crc kubenswrapper[4722]: I0219 19:20:07.070770 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:07 crc kubenswrapper[4722]: E0219 19:20:07.071699 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:07 crc kubenswrapper[4722]: I0219 19:20:07.071747 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:20:07 crc kubenswrapper[4722]: E0219 19:20:07.072810 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:20:08 crc kubenswrapper[4722]: I0219 19:20:08.071106 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:08 crc kubenswrapper[4722]: I0219 19:20:08.071097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:08 crc kubenswrapper[4722]: I0219 19:20:08.071097 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:08 crc kubenswrapper[4722]: E0219 19:20:08.071528 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:08 crc kubenswrapper[4722]: E0219 19:20:08.071704 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:08 crc kubenswrapper[4722]: E0219 19:20:08.072034 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:09 crc kubenswrapper[4722]: I0219 19:20:09.070429 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:09 crc kubenswrapper[4722]: E0219 19:20:09.070673 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:10 crc kubenswrapper[4722]: I0219 19:20:10.070570 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:10 crc kubenswrapper[4722]: E0219 19:20:10.070736 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:10 crc kubenswrapper[4722]: I0219 19:20:10.070974 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:10 crc kubenswrapper[4722]: E0219 19:20:10.071060 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:10 crc kubenswrapper[4722]: I0219 19:20:10.070567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:10 crc kubenswrapper[4722]: E0219 19:20:10.071364 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:11 crc kubenswrapper[4722]: I0219 19:20:11.070504 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:11 crc kubenswrapper[4722]: E0219 19:20:11.072669 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:12 crc kubenswrapper[4722]: I0219 19:20:12.071363 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:12 crc kubenswrapper[4722]: I0219 19:20:12.071418 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:12 crc kubenswrapper[4722]: I0219 19:20:12.071526 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:12 crc kubenswrapper[4722]: E0219 19:20:12.071744 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:12 crc kubenswrapper[4722]: E0219 19:20:12.071868 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:12 crc kubenswrapper[4722]: E0219 19:20:12.071948 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:13 crc kubenswrapper[4722]: I0219 19:20:13.071123 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:13 crc kubenswrapper[4722]: E0219 19:20:13.071731 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:14 crc kubenswrapper[4722]: I0219 19:20:14.070713 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:14 crc kubenswrapper[4722]: I0219 19:20:14.070798 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:14 crc kubenswrapper[4722]: E0219 19:20:14.070851 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:14 crc kubenswrapper[4722]: I0219 19:20:14.070883 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:14 crc kubenswrapper[4722]: E0219 19:20:14.071063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:14 crc kubenswrapper[4722]: E0219 19:20:14.071219 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.071440 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:15 crc kubenswrapper[4722]: E0219 19:20:15.071612 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.677579 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log" Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678483 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/0.log" Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678561 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16" exitCode=1 Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerDied","Data":"38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"} Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678672 4722 scope.go:117] "RemoveContainer" containerID="5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877" Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.679343 4722 scope.go:117] "RemoveContainer" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16" Feb 19 19:20:15 crc kubenswrapper[4722]: E0219 19:20:15.679626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jnvgg_openshift-multus(7a80fcd7-8ac4-4e82-8f14-93d225898bb5)\"" pod="openshift-multus/multus-jnvgg" podUID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.070546 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.070557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:16 crc kubenswrapper[4722]: E0219 19:20:16.070823 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.070608 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:16 crc kubenswrapper[4722]: E0219 19:20:16.071004 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:16 crc kubenswrapper[4722]: E0219 19:20:16.071267 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.684779 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log" Feb 19 19:20:17 crc kubenswrapper[4722]: I0219 19:20:17.070738 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:17 crc kubenswrapper[4722]: E0219 19:20:17.070949 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:18 crc kubenswrapper[4722]: I0219 19:20:18.070911 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:18 crc kubenswrapper[4722]: E0219 19:20:18.071139 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:18 crc kubenswrapper[4722]: I0219 19:20:18.071213 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:18 crc kubenswrapper[4722]: I0219 19:20:18.071257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:18 crc kubenswrapper[4722]: E0219 19:20:18.071401 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:18 crc kubenswrapper[4722]: E0219 19:20:18.071526 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:19 crc kubenswrapper[4722]: I0219 19:20:19.071432 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:19 crc kubenswrapper[4722]: E0219 19:20:19.071633 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:20 crc kubenswrapper[4722]: I0219 19:20:20.070568 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:20 crc kubenswrapper[4722]: I0219 19:20:20.070649 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:20 crc kubenswrapper[4722]: E0219 19:20:20.070772 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:20 crc kubenswrapper[4722]: E0219 19:20:20.070987 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:20 crc kubenswrapper[4722]: I0219 19:20:20.071011 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:20 crc kubenswrapper[4722]: E0219 19:20:20.071267 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.036514 4722 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.073993 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.074248 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.074607 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.167584 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.702196 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.709142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.709544 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.886576 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podStartSLOduration=100.886557215 podStartE2EDuration="1m40.886557215s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:21.756468324 +0000 UTC m=+121.368818668" watchObservedRunningTime="2026-02-19 19:20:21.886557215 +0000 UTC m=+121.498907539" Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.887174 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s6hhp"] Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.887295 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.887449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:22 crc kubenswrapper[4722]: I0219 19:20:22.070915 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:22 crc kubenswrapper[4722]: E0219 19:20:22.071056 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:22 crc kubenswrapper[4722]: I0219 19:20:22.070945 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:22 crc kubenswrapper[4722]: E0219 19:20:22.071136 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:22 crc kubenswrapper[4722]: I0219 19:20:22.070932 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:22 crc kubenswrapper[4722]: E0219 19:20:22.071226 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070242 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070299 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.070486 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070554 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070642 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.070764 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.070957 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.071114 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070588 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.070720 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.070793 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070592 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070794 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.071007 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.070840 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.168823 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.070672 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.070686 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.071373 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.070755 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.071437 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.071551 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.071835 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.072370 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:29 crc kubenswrapper[4722]: I0219 19:20:29.071504 4722 scope.go:117] "RemoveContainer" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16" Feb 19 19:20:29 crc kubenswrapper[4722]: I0219 19:20:29.746516 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log" Feb 19 19:20:29 crc kubenswrapper[4722]: I0219 19:20:29.746595 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2"} Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070236 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070274 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070252 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070237 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070363 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070543 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070691 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070773 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070574 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070863 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.075878 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.075921 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.075985 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.076432 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.076723 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.077186 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.684503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.726888 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"] Feb 19 19:20:40 crc 
kubenswrapper[4722]: I0219 19:20:40.727720 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.728057 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.728390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.729316 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.729633 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.730751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glfz9"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731222 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731596 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731617 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731734 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732015 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732092 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732178 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732279 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732185 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732414 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732978 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732993 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.733184 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.735042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.735297 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.737135 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.737603 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.742966 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.743804 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.744243 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.744400 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746124 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bg6mf"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746878 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746973 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.747061 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.747216 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.747393 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.752361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.752372 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.753658 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.753893 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.754092 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.754293 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 
19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.754686 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.756446 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.756721 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.756923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757065 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757254 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757466 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757601 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757834 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.758337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.758820 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-7954f5f757-lg2rd"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759121 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759501 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759572 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759741 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759855 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.760038 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759858 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.774114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.774638 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.816187 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.816619 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.817287 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.817392 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.818885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.819425 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.820534 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821486 4722 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821552 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821746 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821865 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.823905 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.823934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824069 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824210 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824362 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824481 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824589 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.825351 4722 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.825396 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h4zk8"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.825745 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826046 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826392 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826475 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826620 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826751 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826844 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckm69\" (UniqueName: \"kubernetes.io/projected/bf8b7b84-382a-410f-8dea-c4f485402a77-kube-api-access-ckm69\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb6fj\" (UniqueName: \"kubernetes.io/projected/a7788f82-4e6b-4d89-b009-0eca5b234009-kube-api-access-rb6fj\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit-dir\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826980 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-encryption-config\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.828606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-images\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjcl\" (UniqueName: \"kubernetes.io/projected/47339628-7112-4f7a-b949-fef983428ebe-kube-api-access-9jjcl\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-node-pullsecrets\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxd7\" (UniqueName: \"kubernetes.io/projected/b7b80c35-8f0b-4f44-af31-0b84ebddd4b8-kube-api-access-nbxd7\") pod \"downloads-7954f5f757-lg2rd\" (UID: \"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8\") " pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.830806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-tt2bj\" (UniqueName: \"kubernetes.io/projected/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-kube-api-access-tt2bj\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831324 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnn2s\" (UniqueName: \"kubernetes.io/projected/8c255c5e-d6d9-4772-9151-0065df6dc00d-kube-api-access-qnn2s\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831397 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831460 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nzgmv"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.827122 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831453 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-serving-cert\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") 
pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-auth-proxy-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-client\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831829 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-policies\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831904 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831926 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-encryption-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47339628-7112-4f7a-b949-fef983428ebe-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.828276 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-client\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832171 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-config\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832559 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832838 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833170 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833197 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833309 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833543 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-dir\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834743 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-machine-approver-tls\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834821 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-image-import-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" 
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rcx\" (UniqueName: \"kubernetes.io/projected/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-kube-api-access-j8rcx\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxcgd\" (UniqueName: \"kubernetes.io/projected/26779d4b-27e7-4bac-a4d8-5c312a6cec13-kube-api-access-cxcgd\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a7788f82-4e6b-4d89-b009-0eca5b234009-serving-cert\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-service-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835029 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26779d4b-27e7-4bac-a4d8-5c312a6cec13-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835054 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26779d4b-27e7-4bac-a4d8-5c312a6cec13-config\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: 
\"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-serving-cert\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835208 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/47339628-7112-4f7a-b949-fef983428ebe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835294 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-config\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835326 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835363 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8b7b84-382a-410f-8dea-c4f485402a77-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835467 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod 
\"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.836422 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.836498 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.837109 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.837263 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.841917 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.841978 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.842131 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.842256 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.842376 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.847676 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.847929 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.852981 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.853170 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p99c4"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.853621 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.870048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886455 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886680 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886805 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886952 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.887911 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.888055 4722 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.889204 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.889292 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.889817 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.890189 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.890615 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891052 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891325 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891321 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.892395 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.892722 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.892959 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893132 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893090 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893345 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893462 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893570 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893679 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gtjsk"] Feb 19 19:20:40 
crc kubenswrapper[4722]: I0219 19:20:40.893299 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893935 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894283 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893712 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894490 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894604 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894058 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894739 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893860 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893755 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894979 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.897711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.898740 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.900391 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.900861 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.901030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.901501 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.903342 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72z7j"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904038 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904479 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904744 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.905217 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.908184 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.908690 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.909259 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.909499 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.913276 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.913733 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.914102 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.914350 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.914767 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.915428 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.915635 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.919248 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.919835 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.920039 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.920664 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.924262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.924293 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.942138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h4zk8"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.942273 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/47339628-7112-4f7a-b949-fef983428ebe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-config\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943376 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-config\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8b7b84-382a-410f-8dea-c4f485402a77-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8t8\" (UniqueName: \"kubernetes.io/projected/51679292-9818-418a-98d6-c442dc7d28e2-kube-api-access-px8t8\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit-dir\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943938 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckm69\" (UniqueName: \"kubernetes.io/projected/bf8b7b84-382a-410f-8dea-c4f485402a77-kube-api-access-ckm69\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb6fj\" (UniqueName: \"kubernetes.io/projected/a7788f82-4e6b-4d89-b009-0eca5b234009-kube-api-access-rb6fj\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944004 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7btmm\" (UniqueName: \"kubernetes.io/projected/2d21a014-83a9-43d9-9cdd-5e0897757c90-kube-api-access-7btmm\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-encryption-config\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944057 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjcl\" (UniqueName: \"kubernetes.io/projected/47339628-7112-4f7a-b949-fef983428ebe-kube-api-access-9jjcl\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-images\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01bb1078-2d76-42f4-919f-3d1b73a61fd4-metrics-tls\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wdwp\" (UniqueName: \"kubernetes.io/projected/01bb1078-2d76-42f4-919f-3d1b73a61fd4-kube-api-access-4wdwp\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944261 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47j49\" (UniqueName: \"kubernetes.io/projected/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-kube-api-access-47j49\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkngc\" (UniqueName: \"kubernetes.io/projected/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-kube-api-access-vkngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-node-pullsecrets\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxd7\" (UniqueName: \"kubernetes.io/projected/b7b80c35-8f0b-4f44-af31-0b84ebddd4b8-kube-api-access-nbxd7\") pod \"downloads-7954f5f757-lg2rd\" (UID: \"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8\") " pod="openshift-console/downloads-7954f5f757-lg2rd"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944512 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmbgg\" (UniqueName: \"kubernetes.io/projected/7c9da917-db10-4eba-bdff-f68354e8d4a6-kube-api-access-lmbgg\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38df6625-e726-49e8-9bff-561442dcea53-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944633 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt2bj\" (UniqueName: \"kubernetes.io/projected/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-kube-api-access-tt2bj\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnn2s\" (UniqueName: \"kubernetes.io/projected/8c255c5e-d6d9-4772-9151-0065df6dc00d-kube-api-access-qnn2s\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944803 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944877 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-serving-cert\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-etcd-client\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945087 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-auth-proxy-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d21a014-83a9-43d9-9cdd-5e0897757c90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945252 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945283 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-client\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-policies\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945423 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-encryption-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945445 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38df6625-e726-49e8-9bff-561442dcea53-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47339628-7112-4f7a-b949-fef983428ebe-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-serving-cert\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc64p\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-kube-api-access-mc64p\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945706 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-service-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-client\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-config\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945933 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946017 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946036 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-dir\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-machine-approver-tls\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946169 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-image-import-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946243 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c9da917-db10-4eba-bdff-f68354e8d4a6-tmpfs\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946379 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946365 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rcx\" (UniqueName: \"kubernetes.io/projected/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-kube-api-access-j8rcx\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946599 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcgd\" (UniqueName: \"kubernetes.io/projected/26779d4b-27e7-4bac-a4d8-5c312a6cec13-kube-api-access-cxcgd\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946623 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7788f82-4e6b-4d89-b009-0eca5b234009-serving-cert\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-service-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: 
\"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946669 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946692 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26779d4b-27e7-4bac-a4d8-5c312a6cec13-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26779d4b-27e7-4bac-a4d8-5c312a6cec13-config\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946786 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946809 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-serving-cert\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.947710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26779d4b-27e7-4bac-a4d8-5c312a6cec13-config\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949107 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949189 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949206 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hxzjr"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949622 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/47339628-7112-4f7a-b949-fef983428ebe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.950201 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.950680 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-service-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.979487 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.982351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.983142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.985493 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-config\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.989657 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.990841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.991578 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26779d4b-27e7-4bac-a4d8-5c312a6cec13-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.991886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.992190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" 
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.993927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.999521 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.995966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-policies\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:40.950850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-node-pullsecrets\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:40.985790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-auth-proxy-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:40.950946 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit-dir\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008520 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008599 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8b7b84-382a-410f-8dea-c4f485402a77-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-serving-cert\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009454 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009516 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-image-import-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009811 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7788f82-4e6b-4d89-b009-0eca5b234009-serving-cert\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:41 crc 
kubenswrapper[4722]: I0219 19:20:41.010411 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.010774 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.011067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-encryption-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.011806 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-images\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.012238 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-serving-cert\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.000756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-dir\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015143 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015260 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glfz9"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-machine-approver-tls\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-config\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.016137 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-encryption-config\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.016942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-client\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: 
\"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.016995 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.018920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-client\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.020489 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.020698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47339628-7112-4f7a-b949-fef983428ebe-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.020756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 
19:20:41.021338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.021660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022054 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8ppnm"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022794 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.023738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.023781 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.024020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.025127 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.026526 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.026543 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.027781 4722 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72z7j"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.029198 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nm78h"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.029777 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.030060 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.030441 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lg2rd"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.031279 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.032464 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.033474 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.034432 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.035591 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.036361 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gtjsk"] Feb 19 
19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.037336 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.038270 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vcmxn"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.039334 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.039427 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.040307 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.041275 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.042230 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.043278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.044606 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.045773 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 
19:20:41.046888 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bg6mf"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047343 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-config\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8t8\" (UniqueName: \"kubernetes.io/projected/51679292-9818-418a-98d6-c442dc7d28e2-kube-api-access-px8t8\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7btmm\" (UniqueName: \"kubernetes.io/projected/2d21a014-83a9-43d9-9cdd-5e0897757c90-kube-api-access-7btmm\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01bb1078-2d76-42f4-919f-3d1b73a61fd4-metrics-tls\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:41 crc 
kubenswrapper[4722]: I0219 19:20:41.047611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wdwp\" (UniqueName: \"kubernetes.io/projected/01bb1078-2d76-42f4-919f-3d1b73a61fd4-kube-api-access-4wdwp\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047633 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47j49\" (UniqueName: \"kubernetes.io/projected/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-kube-api-access-47j49\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkngc\" (UniqueName: \"kubernetes.io/projected/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-kube-api-access-vkngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmbgg\" (UniqueName: \"kubernetes.io/projected/7c9da917-db10-4eba-bdff-f68354e8d4a6-kube-api-access-lmbgg\") 
pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38df6625-e726-49e8-9bff-561442dcea53-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047757 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-etcd-client\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047845 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2d21a014-83a9-43d9-9cdd-5e0897757c90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047873 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38df6625-e726-49e8-9bff-561442dcea53-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047895 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-serving-cert\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc64p\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-kube-api-access-mc64p\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047976 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: 
\"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-service-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048059 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c9da917-db10-4eba-bdff-f68354e8d4a6-tmpfs\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c9da917-db10-4eba-bdff-f68354e8d4a6-tmpfs\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.049370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.049539 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.050957 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8ppnm"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.051908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d21a014-83a9-43d9-9cdd-5e0897757c90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.051999 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vcmxn"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.053060 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p99c4"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.054500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.055553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 
19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.056568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nm78h"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.057547 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.058526 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.059607 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kqs9s"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.060490 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.060570 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kqs9s"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.069945 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.090347 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.119162 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.130786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.150051 4722 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.169824 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.190649 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.202345 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38df6625-e726-49e8-9bff-561442dcea53-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.210013 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.235842 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.239116 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38df6625-e726-49e8-9bff-561442dcea53-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.249830 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.270566 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.290250 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.301032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-serving-cert\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.311908 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.318745 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.330612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.349980 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.358934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-config\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.369361 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-etcd-client\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.370579 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.389822 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.398759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-service-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.410224 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.430071 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.450841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.472113 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.490112 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.509935 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.550601 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.569858 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.590595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.611385 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.632672 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.650379 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.671065 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.683296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01bb1078-2d76-42f4-919f-3d1b73a61fd4-metrics-tls\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.691008 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.710292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.731316 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.750636 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.771637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.791872 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.798734 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.798805 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 
19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.811177 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.830688 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.850039 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.871520 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.890265 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.908899 4722 request.go:700] Waited for 1.003806087s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.911113 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.930888 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.951022 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.970494 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.990368 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.001341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.010189 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.030049 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047610 4722 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047675 4722 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047698 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca podName:cb6886b7-9193-4c89-96c8-64b61c3251a4 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547672135 +0000 UTC m=+142.160022499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca") pod "marketplace-operator-79b997595-4gbkr" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4") : failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047799 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert podName:c02c0f7a-9c0e-4d91-aca7-9648bace7d2f nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547771598 +0000 UTC m=+142.160121962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-5xkn2" (UID: "c02c0f7a-9c0e-4d91-aca7-9648bace7d2f") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047837 4722 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047882 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert podName:e8ae2d71-7578-4343-a1ba-5d414cd1cc4b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547865951 +0000 UTC m=+142.160216315 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert") pod "olm-operator-6b444d44fb-cjtjp" (UID: "e8ae2d71-7578-4343-a1ba-5d414cd1cc4b") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047910 4722 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047952 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics podName:cb6886b7-9193-4c89-96c8-64b61c3251a4 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547940273 +0000 UTC m=+142.160290627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics") pod "marketplace-operator-79b997595-4gbkr" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047970 4722 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048033 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert podName:7c9da917-db10-4eba-bdff-f68354e8d4a6 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.548016636 +0000 UTC m=+142.160366990 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert") pod "packageserver-d55dfcdfc-klvwp" (UID: "7c9da917-db10-4eba-bdff-f68354e8d4a6") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048879 4722 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048910 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config podName:c02c0f7a-9c0e-4d91-aca7-9648bace7d2f nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.548899954 +0000 UTC m=+142.161250278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config") pod "kube-storage-version-migrator-operator-b67b599dd-5xkn2" (UID: "c02c0f7a-9c0e-4d91-aca7-9648bace7d2f") : failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048923 4722 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048991 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert podName:7c9da917-db10-4eba-bdff-f68354e8d4a6 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.548975577 +0000 UTC m=+142.161325941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert") pod "packageserver-d55dfcdfc-klvwp" (UID: "7c9da917-db10-4eba-bdff-f68354e8d4a6") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.050957 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.069848 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.090637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.111328 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.130699 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.160483 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.170550 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.191132 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.211583 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.230846 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.250710 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.270854 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.290857 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.311648 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.330759 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.350359 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.370439 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.390339 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.409922 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.489443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rcx\" (UniqueName: \"kubernetes.io/projected/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-kube-api-access-j8rcx\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.506103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcgd\" (UniqueName: \"kubernetes.io/projected/26779d4b-27e7-4bac-a4d8-5c312a6cec13-kube-api-access-cxcgd\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.511064 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.511467 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.528840 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.530948 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.551895 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 19:20:42 crc kubenswrapper[4722]: W0219 19:20:42.554402 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd33763f9_ec4f_4337_b9d4_f5c25ec6eabc.slice/crio-8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43 WatchSource:0}: Error finding container 8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43: Status 404 returned error can't find the container with id 8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43 Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566037 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566202 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566334 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.568530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.568606 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.569998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.570966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.571326 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.572573 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.573279 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.589065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt2bj\" (UniqueName: \"kubernetes.io/projected/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-kube-api-access-tt2bj\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.615060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckm69\" (UniqueName: \"kubernetes.io/projected/bf8b7b84-382a-410f-8dea-c4f485402a77-kube-api-access-ckm69\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 
19:20:42.625034 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.625555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb6fj\" (UniqueName: \"kubernetes.io/projected/a7788f82-4e6b-4d89-b009-0eca5b234009-kube-api-access-rb6fj\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.641966 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.647370 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxd7\" (UniqueName: \"kubernetes.io/projected/b7b80c35-8f0b-4f44-af31-0b84ebddd4b8-kube-api-access-nbxd7\") pod \"downloads-7954f5f757-lg2rd\" (UID: \"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8\") " pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.674266 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.679050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjcl\" (UniqueName: \"kubernetes.io/projected/47339628-7112-4f7a-b949-fef983428ebe-kube-api-access-9jjcl\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.693105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.708884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnn2s\" (UniqueName: \"kubernetes.io/projected/8c255c5e-d6d9-4772-9151-0065df6dc00d-kube-api-access-qnn2s\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.719441 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.727543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.731586 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.749136 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.750953 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.772625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.791110 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.810390 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.821361 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.830338 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.835277 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.838363 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" event={"ID":"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc","Type":"ContainerStarted","Data":"8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43"} Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.842672 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.844517 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"] Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.850895 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.857062 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:42 crc kubenswrapper[4722]: W0219 19:20:42.858313 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5693eddb_45a4_4cee_acb8_d3c0f23d16b8.slice/crio-f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839 WatchSource:0}: Error finding container f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839: Status 404 returned error can't find the container with id f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839 Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.871623 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.873348 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp"] Feb 19 19:20:42 crc kubenswrapper[4722]: W0219 19:20:42.892762 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26779d4b_27e7_4bac_a4d8_5c312a6cec13.slice/crio-bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb WatchSource:0}: Error finding container bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb: Status 404 returned error can't find the container with id bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.892963 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.910355 4722 request.go:700] Waited for 1.870719484s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.913116 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.932580 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.945261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.952143 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.990564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.991426 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.992887 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.002364 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glfz9"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.007682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.030085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7btmm\" (UniqueName: \"kubernetes.io/projected/2d21a014-83a9-43d9-9cdd-5e0897757c90-kube-api-access-7btmm\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.047956 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47j49\" (UniqueName: \"kubernetes.io/projected/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-kube-api-access-47j49\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.065249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wdwp\" (UniqueName: \"kubernetes.io/projected/01bb1078-2d76-42f4-919f-3d1b73a61fd4-kube-api-access-4wdwp\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.087286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8t8\" (UniqueName: \"kubernetes.io/projected/51679292-9818-418a-98d6-c442dc7d28e2-kube-api-access-px8t8\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.118083 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkngc\" (UniqueName: \"kubernetes.io/projected/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-kube-api-access-vkngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.135655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmbgg\" (UniqueName: \"kubernetes.io/projected/7c9da917-db10-4eba-bdff-f68354e8d4a6-kube-api-access-lmbgg\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.149698 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.150872 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.152322 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bg6mf"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.153623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc64p\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-kube-api-access-mc64p\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.165652 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c255c5e_d6d9_4772_9151_0065df6dc00d.slice/crio-accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868 WatchSource:0}: Error finding container accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868: Status 404 returned error can't find the container with id accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.174643 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.196329 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.206543 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.233721 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.284298 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289852 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ea57e2-def2-4a73-a86b-75be99e36e46-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289896 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00e2406-a55b-4e28-bed9-a060b0780301-serving-cert\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a159fd13-de0a-46d3-971f-fb7c2fc652bd-config\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290051 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290166 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290208 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290244 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669833fe-83b5-4d4a-a78c-c360789f754b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.290285 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbbxv\" (UniqueName: \"kubernetes.io/projected/2d33e000-0a81-4601-8120-52dacf0b5d6b-kube-api-access-zbbxv\") pod \"migrator-59844c95c7-9sqrc\" (UID: \"2d33e000-0a81-4601-8120-52dacf0b5d6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290325 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e040b47b-3688-40e2-a410-0dfa43ad8ef3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290378 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2csz\" (UniqueName: 
\"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-metrics-certs\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-proxy-tls\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" 
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/809781cd-b87f-423a-957c-0d20e074306e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwxck\" (UniqueName: \"kubernetes.io/projected/e040b47b-3688-40e2-a410-0dfa43ad8ef3-kube-api-access-mwxck\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zg7\" (UniqueName: \"kubernetes.io/projected/f00e2406-a55b-4e28-bed9-a060b0780301-kube-api-access-h7zg7\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290568 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzk45\" (UniqueName: \"kubernetes.io/projected/a159fd13-de0a-46d3-971f-fb7c2fc652bd-kube-api-access-zzk45\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt4vb\" (UniqueName: \"kubernetes.io/projected/41fade82-0d8d-41b2-805e-8a92ffa97cf3-kube-api-access-rt4vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a159fd13-de0a-46d3-971f-fb7c2fc652bd-serving-cert\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-default-certificate\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290746 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669833fe-83b5-4d4a-a78c-c360789f754b-config\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290773 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xdq\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-kube-api-access-r9xdq\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290789 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-images\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv46w\" (UniqueName: \"kubernetes.io/projected/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-kube-api-access-vv46w\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290841 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290873 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88gjx\" (UniqueName: \"kubernetes.io/projected/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-kube-api-access-88gjx\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290902 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3ea57e2-def2-4a73-a86b-75be99e36e46-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/41fade82-0d8d-41b2-805e-8a92ffa97cf3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-srv-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-stats-auth\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-trusted-ca\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291042 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071e162-d262-4732-81ca-10bb9b507321-service-ca-bundle\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809781cd-b87f-423a-957c-0d20e074306e-config\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291072 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/669833fe-83b5-4d4a-a78c-c360789f754b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8l4m\" (UniqueName: \"kubernetes.io/projected/b3ea57e2-def2-4a73-a86b-75be99e36e46-kube-api-access-j8l4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-config\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291210 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwb2\" (UniqueName: \"kubernetes.io/projected/3071e162-d262-4732-81ca-10bb9b507321-kube-api-access-9vwb2\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.291227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/809781cd-b87f-423a-957c-0d20e074306e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-proxy-tls\") pod 
\"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgwl\" (UniqueName: \"kubernetes.io/projected/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-kube-api-access-zsgwl\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mt8\" (UniqueName: \"kubernetes.io/projected/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-kube-api-access-t9mt8\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.297519 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:43.797502366 +0000 UTC m=+143.409852760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.299637 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.301030 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.307514 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.316821 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.392823 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.392996 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:43.892971166 +0000 UTC m=+143.505321490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393119 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-mountpoint-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-socket-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-metrics-certs\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393345 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-proxy-tls\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/809781cd-b87f-423a-957c-0d20e074306e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393444 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwxck\" (UniqueName: \"kubernetes.io/projected/e040b47b-3688-40e2-a410-0dfa43ad8ef3-kube-api-access-mwxck\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zg7\" (UniqueName: \"kubernetes.io/projected/f00e2406-a55b-4e28-bed9-a060b0780301-kube-api-access-h7zg7\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-key\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-node-bootstrap-token\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzk45\" (UniqueName: \"kubernetes.io/projected/a159fd13-de0a-46d3-971f-fb7c2fc652bd-kube-api-access-zzk45\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt4vb\" (UniqueName: 
\"kubernetes.io/projected/41fade82-0d8d-41b2-805e-8a92ffa97cf3-kube-api-access-rt4vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a159fd13-de0a-46d3-971f-fb7c2fc652bd-serving-cert\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-default-certificate\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393788 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669833fe-83b5-4d4a-a78c-c360789f754b-config\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393818 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xdq\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-kube-api-access-r9xdq\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393841 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-images\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv46w\" (UniqueName: \"kubernetes.io/projected/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-kube-api-access-vv46w\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394311 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3ea57e2-def2-4a73-a86b-75be99e36e46-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-cabundle\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394417 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88gjx\" (UniqueName: \"kubernetes.io/projected/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-kube-api-access-88gjx\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/41fade82-0d8d-41b2-805e-8a92ffa97cf3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394463 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86s9\" (UniqueName: \"kubernetes.io/projected/71548ff6-f831-48ba-af51-99fe431c447a-kube-api-access-l86s9\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394477 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjsx\" (UniqueName: \"kubernetes.io/projected/814d776b-73c6-4354-8195-da5d3ea2d5cb-kube-api-access-xvjsx\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-vtkgt\" (UniqueName: \"kubernetes.io/projected/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-kube-api-access-vtkgt\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-srv-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394574 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-stats-auth\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-trusted-ca\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071e162-d262-4732-81ca-10bb9b507321-service-ca-bundle\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394629 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809781cd-b87f-423a-957c-0d20e074306e-config\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-profile-collector-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8l4m\" (UniqueName: \"kubernetes.io/projected/b3ea57e2-def2-4a73-a86b-75be99e36e46-kube-api-access-j8l4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/669833fe-83b5-4d4a-a78c-c360789f754b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" 
(UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394713 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-config\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394730 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394779 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwb2\" (UniqueName: \"kubernetes.io/projected/3071e162-d262-4732-81ca-10bb9b507321-kube-api-access-9vwb2\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/809781cd-b87f-423a-957c-0d20e074306e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394843 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2pqg\" (UniqueName: \"kubernetes.io/projected/bb502645-30c6-437d-abc3-28de80105939-kube-api-access-m2pqg\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-registration-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-proxy-tls\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 
19:20:43.394915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgwl\" (UniqueName: \"kubernetes.io/projected/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-kube-api-access-zsgwl\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mt8\" (UniqueName: \"kubernetes.io/projected/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-kube-api-access-t9mt8\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: 
\"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00e2406-a55b-4e28-bed9-a060b0780301-serving-cert\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ea57e2-def2-4a73-a86b-75be99e36e46-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a159fd13-de0a-46d3-971f-fb7c2fc652bd-config\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395084 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-certs\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395104 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-csi-data-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395146 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71548ff6-f831-48ba-af51-99fe431c447a-config-volume\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395198 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82lfl\" (UniqueName: \"kubernetes.io/projected/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-kube-api-access-82lfl\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395255 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395310 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71548ff6-f831-48ba-af51-99fe431c447a-metrics-tls\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395328 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395365 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669833fe-83b5-4d4a-a78c-c360789f754b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395379 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-cert\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395405 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbbxv\" (UniqueName: \"kubernetes.io/projected/2d33e000-0a81-4601-8120-52dacf0b5d6b-kube-api-access-zbbxv\") pod \"migrator-59844c95c7-9sqrc\" (UID: \"2d33e000-0a81-4601-8120-52dacf0b5d6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e040b47b-3688-40e2-a410-0dfa43ad8ef3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395504 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-plugins-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.396694 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-images\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.396878 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:43.89686789 +0000 UTC m=+143.509218214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.398224 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.398434 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.399738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.401475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ea57e2-def2-4a73-a86b-75be99e36e46-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: 
\"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.402304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.402417 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a159fd13-de0a-46d3-971f-fb7c2fc652bd-config\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.403867 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.405300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.405928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.411377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.411397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-config\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.411787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00e2406-a55b-4e28-bed9-a060b0780301-serving-cert\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.412389 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.413132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/669833fe-83b5-4d4a-a78c-c360789f754b-config\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.413204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.413636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.416374 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.419249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.419819 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-metrics-certs\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.421338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.421742 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a159fd13-de0a-46d3-971f-fb7c2fc652bd-serving-cert\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.421997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-proxy-tls\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.422298 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669833fe-83b5-4d4a-a78c-c360789f754b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 
crc kubenswrapper[4722]: I0219 19:20:43.425558 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.425836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.425995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-srv-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.426111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e040b47b-3688-40e2-a410-0dfa43ad8ef3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.426173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"console-f9d7485db-txlzt\" (UID: 
\"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.426388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809781cd-b87f-423a-957c-0d20e074306e-config\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.428010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-trusted-ca\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.429672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-stats-auth\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.429863 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-profile-collector-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.432646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3ea57e2-def2-4a73-a86b-75be99e36e46-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.433861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-proxy-tls\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.434772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071e162-d262-4732-81ca-10bb9b507321-service-ca-bundle\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.434816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-default-certificate\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.435843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zg7\" (UniqueName: \"kubernetes.io/projected/f00e2406-a55b-4e28-bed9-a060b0780301-kube-api-access-h7zg7\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.445300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/809781cd-b87f-423a-957c-0d20e074306e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/41fade82-0d8d-41b2-805e-8a92ffa97cf3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.454445 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.456869 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.459496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt4vb\" (UniqueName: \"kubernetes.io/projected/41fade82-0d8d-41b2-805e-8a92ffa97cf3-kube-api-access-rt4vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.460341 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lg2rd"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.467007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwb2\" (UniqueName: \"kubernetes.io/projected/3071e162-d262-4732-81ca-10bb9b507321-kube-api-access-9vwb2\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.482251 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.493668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496413 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496688 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-registration-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496945 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-certs\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-csi-data-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " 
pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71548ff6-f831-48ba-af51-99fe431c447a-config-volume\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82lfl\" (UniqueName: \"kubernetes.io/projected/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-kube-api-access-82lfl\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71548ff6-f831-48ba-af51-99fe431c447a-metrics-tls\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-cert\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-plugins-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497105 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-mountpoint-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497119 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-socket-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-key\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-node-bootstrap-token\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497266 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-cabundle\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497282 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l86s9\" (UniqueName: \"kubernetes.io/projected/71548ff6-f831-48ba-af51-99fe431c447a-kube-api-access-l86s9\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjsx\" (UniqueName: \"kubernetes.io/projected/814d776b-73c6-4354-8195-da5d3ea2d5cb-kube-api-access-xvjsx\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtkgt\" (UniqueName: \"kubernetes.io/projected/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-kube-api-access-vtkgt\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2pqg\" (UniqueName: \"kubernetes.io/projected/bb502645-30c6-437d-abc3-28de80105939-kube-api-access-m2pqg\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.497395 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:43.997379481 +0000 UTC m=+143.609729795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-csi-data-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.498026 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-mountpoint-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.498384 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-plugins-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.498807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xdq\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-kube-api-access-r9xdq\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.499015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-registration-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.499059 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-socket-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.499405 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.503304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71548ff6-f831-48ba-af51-99fe431c447a-metrics-tls\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.504738 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.504954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-key\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.507016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-certs\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.507240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-node-bootstrap-token\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.508703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.509393 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-cert\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.526004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/809781cd-b87f-423a-957c-0d20e074306e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 
19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.533729 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gtjsk"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.542273 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71548ff6-f831-48ba-af51-99fe431c447a-config-volume\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.543659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzk45\" (UniqueName: \"kubernetes.io/projected/a159fd13-de0a-46d3-971f-fb7c2fc652bd-kube-api-access-zzk45\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.555436 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.556645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-cabundle\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.572450 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.572873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgwl\" (UniqueName: \"kubernetes.io/projected/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-kube-api-access-zsgwl\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.592945 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc880c8_beb9_4081_8af6_64d2fa857901.slice/crio-e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545 WatchSource:0}: Error finding container e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545: Status 404 returned error can't find the container with id e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545 Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.595144 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3071e162_d262_4732_81ca_10bb9b507321.slice/crio-4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37 WatchSource:0}: Error finding 
container 4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37: Status 404 returned error can't find the container with id 4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.597882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.601439 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.101406394 +0000 UTC m=+143.713756718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.612491 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv46w\" (UniqueName: \"kubernetes.io/projected/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-kube-api-access-vv46w\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.617086 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.621025 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.628271 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.646943 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mt8\" (UniqueName: \"kubernetes.io/projected/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-kube-api-access-t9mt8\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.669462 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.685900 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p99c4"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.686421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.698826 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.699016 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.198989271 +0000 UTC m=+143.811339595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.699049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.699440 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.199432075 +0000 UTC m=+143.811782399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.708873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.721719 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51679292_9818_418a_98d6_c442dc7d28e2.slice/crio-ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9 WatchSource:0}: Error finding container ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9: Status 404 returned error can't find the container with id ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.730761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.747427 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/669833fe-83b5-4d4a-a78c-c360789f754b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.761799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.768054 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.769466 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.776855 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbbxv\" (UniqueName: \"kubernetes.io/projected/2d33e000-0a81-4601-8120-52dacf0b5d6b-kube-api-access-zbbxv\") pod \"migrator-59844c95c7-9sqrc\" (UID: \"2d33e000-0a81-4601-8120-52dacf0b5d6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.785792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwxck\" (UniqueName: \"kubernetes.io/projected/e040b47b-3688-40e2-a410-0dfa43ad8ef3-kube-api-access-mwxck\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.796467 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.797653 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h4zk8"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.799672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.799857 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.299836522 +0000 UTC m=+143.912186846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.799968 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.800290 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.300282577 +0000 UTC m=+143.912632901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.805488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8l4m\" (UniqueName: \"kubernetes.io/projected/b3ea57e2-def2-4a73-a86b-75be99e36e46-kube-api-access-j8l4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.807960 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.812583 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.822108 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.825958 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.826478 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.828772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88gjx\" (UniqueName: \"kubernetes.io/projected/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-kube-api-access-88gjx\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.843789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.845815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82lfl\" (UniqueName: \"kubernetes.io/projected/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-kube-api-access-82lfl\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.849861 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" event={"ID":"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b","Type":"ContainerStarted","Data":"937a76b33078699fdd749624ad5fbc055d6636f974ccd1fc5e83353583659e23"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.853644 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" event={"ID":"38df6625-e726-49e8-9bff-561442dcea53","Type":"ContainerStarted","Data":"980a7eef68679d7d7667805ee210f572632e488a6a12ac11df8fce3e620735b4"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.853687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" event={"ID":"38df6625-e726-49e8-9bff-561442dcea53","Type":"ContainerStarted","Data":"9d1191245297017781f6f0d59c9e68f7b88c6be5d638855d55459cf690589f08"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.855877 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.856294 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" event={"ID":"a7788f82-4e6b-4d89-b009-0eca5b234009","Type":"ContainerStarted","Data":"a3c583618743072d3a8c63892ba8aed32eb325aba749dae0a0acd16ff2007d50"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.856322 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" event={"ID":"a7788f82-4e6b-4d89-b009-0eca5b234009","Type":"ContainerStarted","Data":"a026f18c23225d1771444fa7c9ac77f8b95183a72ab6190d1f7080aa1962c8b8"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.857605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nzgmv" event={"ID":"3071e162-d262-4732-81ca-10bb9b507321","Type":"ContainerStarted","Data":"4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.861316 4722 generic.go:334] "Generic (PLEG): container finished" podID="5693eddb-45a4-4cee-acb8-d3c0f23d16b8" containerID="1eaef094f51d48c22c87113b7f264cf4f078eb98ddaab98503078950021a15ac" exitCode=0 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.861394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" 
event={"ID":"5693eddb-45a4-4cee-acb8-d3c0f23d16b8","Type":"ContainerDied","Data":"1eaef094f51d48c22c87113b7f264cf4f078eb98ddaab98503078950021a15ac"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.861423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" event={"ID":"5693eddb-45a4-4cee-acb8-d3c0f23d16b8","Type":"ContainerStarted","Data":"f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.865040 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.866354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" event={"ID":"bf8b7b84-382a-410f-8dea-c4f485402a77","Type":"ContainerStarted","Data":"4524fe7e56fb120ed1f10ce6083b180ce28a4063125657070a3dd348cfebd5dd"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.866392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" event={"ID":"bf8b7b84-382a-410f-8dea-c4f485402a77","Type":"ContainerStarted","Data":"b449e993c7b506aa07679f65f11ca216831fe291a98147e01c75d4ae55f5d767"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.866404 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" event={"ID":"bf8b7b84-382a-410f-8dea-c4f485402a77","Type":"ContainerStarted","Data":"db7a9df904e922606f774e25d285494e99875d1b8fcc171af6ed4dca16c8ade1"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.868510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2pqg\" (UniqueName: \"kubernetes.io/projected/bb502645-30c6-437d-abc3-28de80105939-kube-api-access-m2pqg\") pod \"csi-hostpathplugin-kqs9s\" (UID: 
\"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.871640 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.872287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" event={"ID":"f00e2406-a55b-4e28-bed9-a060b0780301","Type":"ContainerStarted","Data":"82508cfbd650f98148fb226498a6f0165ea19ee56f6b84eb2bb880b85c8acc4e"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.874407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" event={"ID":"26779d4b-27e7-4bac-a4d8-5c312a6cec13","Type":"ContainerStarted","Data":"e9a59274a7088a93a2ec77294d99e283bb869e2553d73855111baa2c25d9f8dc"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.874464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" event={"ID":"26779d4b-27e7-4bac-a4d8-5c312a6cec13","Type":"ContainerStarted","Data":"bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.878835 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.887623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86s9\" (UniqueName: \"kubernetes.io/projected/71548ff6-f831-48ba-af51-99fe431c447a-kube-api-access-l86s9\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.888835 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.901103 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.901447 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.401427898 +0000 UTC m=+144.013778222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.910443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjsx\" (UniqueName: \"kubernetes.io/projected/814d776b-73c6-4354-8195-da5d3ea2d5cb-kube-api-access-xvjsx\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.912207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerStarted","Data":"e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.919248 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" event={"ID":"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f","Type":"ContainerStarted","Data":"f0e89d11d34992a9d3e2fd6ed43adbfd1bd19f8094145800e8d76f0e8ae93eaf"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.925928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lg2rd" event={"ID":"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8","Type":"ContainerStarted","Data":"04081e8ba931886e1736ede667ab583c82c3730fd987315d4b52e12ed7c811d5"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.926541 4722 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.932662 4722 generic.go:334] "Generic (PLEG): container finished" podID="8c255c5e-d6d9-4772-9151-0065df6dc00d" containerID="5c44c055c09feba5ea63deb960006d6c67d2cfab710a3a6c6d5997ee7bb87a61" exitCode=0 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.932939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerDied","Data":"5c44c055c09feba5ea63deb960006d6c67d2cfab710a3a6c6d5997ee7bb87a61"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.932980 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerStarted","Data":"accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.942370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" event={"ID":"2d21a014-83a9-43d9-9cdd-5e0897757c90","Type":"ContainerStarted","Data":"0930f6cbddbd346945e235439f5391cf2978ec9a51ea4717faa4f07b16a397fd"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.942636 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.946771 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.948078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtkgt\" (UniqueName: \"kubernetes.io/projected/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-kube-api-access-vtkgt\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.948464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerStarted","Data":"91e29c4c51cd956e7890c0dbe940cd28aaff5babb9d72cd9fb735cea262c06b2"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.956599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" event={"ID":"51679292-9818-418a-98d6-c442dc7d28e2","Type":"ContainerStarted","Data":"ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9"} Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.959843 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9da917_db10_4eba_bdff_f68354e8d4a6.slice/crio-1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8 WatchSource:0}: Error finding container 1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8: Status 404 returned error can't find the container with id 1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.961345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" 
event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerStarted","Data":"0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.961392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerStarted","Data":"6e1b8dc29249f786b414083b626373283ac9d3f4f6727c121afc4a975d983b31"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.961758 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.963972 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.964283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" event={"ID":"01bb1078-2d76-42f4-919f-3d1b73a61fd4","Type":"ContainerStarted","Data":"f2e89e44666f131e73fb4bba7527eb9deff2a5a021dd6c53a89f611465012a71"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.975574 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.976059 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.977415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" event={"ID":"47339628-7112-4f7a-b949-fef983428ebe","Type":"ContainerStarted","Data":"39ba28e8024c70b504ad794063096c96e6db73f940777aa78abdfcf3c54bcde5"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.983567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" event={"ID":"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc","Type":"ContainerStarted","Data":"29b5691d6a5701744331cdb2fe2e088cb8011eb8feb9a1593c6083fcbeb3e44e"} Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.983609 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" event={"ID":"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc","Type":"ContainerStarted","Data":"dcec5d7923af0ae25f7c1d25aefaa3e6bfe154ea00901d8bca40695a3dfc2f0c"} Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.003610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.004409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerStarted","Data":"d0c096f9abea14bd89e01cd5df78cfd43109b66f0678b624949e1ec87cdc1cd4"} Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.004974 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.504912743 +0000 UTC m=+144.117263127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.011444 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.034427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" event={"ID":"41fade82-0d8d-41b2-805e-8a92ffa97cf3","Type":"ContainerStarted","Data":"b7af01b146418b2959ab63e4c1e4bd3213696626eb16332d455a5d5ff7b805d8"} Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.072654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.089327 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.104529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.105817 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.605796346 +0000 UTC m=+144.218146680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.119930 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.122465 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.201784 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.205953 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.207906 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.707888588 +0000 UTC m=+144.320238912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.232632 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.240922 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda159fd13_de0a_46d3_971f_fb7c2fc652bd.slice/crio-1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe WatchSource:0}: Error finding container 1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe: Status 404 returned error can't find the container with id 1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.253795 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod187676b8_1029_4153_9da5_6614e9b7892e.slice/crio-fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3 WatchSource:0}: Error finding container fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3: Status 404 returned error can't find the container with id fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3 Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.307063 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.307306 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.807284172 +0000 UTC m=+144.419634496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.408203 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814d776b_73c6_4354_8195_da5d3ea2d5cb.slice/crio-26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964 WatchSource:0}: Error finding container 26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964: Status 404 returned error can't find the container with id 26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964 Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.408312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.408659 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.90864229 +0000 UTC m=+144.520992674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.479398 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.510522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.510686 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.010659769 +0000 UTC m=+144.623010093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.510916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.511249 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.011233537 +0000 UTC m=+144.623583861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.612582 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.612863 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.112847953 +0000 UTC m=+144.725198277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.655422 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.675305 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.716576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.717328 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.21730622 +0000 UTC m=+144.829656544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.725433 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" podStartSLOduration=124.725413068 podStartE2EDuration="2m4.725413068s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:44.717780065 +0000 UTC m=+144.330130389" watchObservedRunningTime="2026-02-19 19:20:44.725413068 +0000 UTC m=+144.337763392" Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.744990 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72z7j"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.761420 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podStartSLOduration=123.761401514 podStartE2EDuration="2m3.761401514s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:44.759296047 +0000 UTC m=+144.371646391" watchObservedRunningTime="2026-02-19 19:20:44.761401514 +0000 UTC m=+144.373751838" Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.773187 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce4ba5c_9c53_4a07_a57d_3c3532449ae8.slice/crio-f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4 WatchSource:0}: Error finding container f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4: Status 404 returned error can't find the container with id f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4 Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.813888 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1bae0d8_92c9_40e9_ad8d_cc01467c8d93.slice/crio-dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771 WatchSource:0}: Error finding container dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771: Status 404 returned error can't find the container with id dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771 Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.818969 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.820047 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.320019721 +0000 UTC m=+144.932370085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.920905 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.922457 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.422445562 +0000 UTC m=+145.034795886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.950335 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kqs9s"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.957546 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.969535 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"] Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.975078 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb502645_30c6_437d_abc3_28de80105939.slice/crio-2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d WatchSource:0}: Error finding container 2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d: Status 404 returned error can't find the container with id 2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.000594 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" podStartSLOduration=125.00057357 podStartE2EDuration="2m5.00057357s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-19 19:20:44.999783745 +0000 UTC m=+144.612134069" watchObservedRunningTime="2026-02-19 19:20:45.00057357 +0000 UTC m=+144.612923914" Feb 19 19:20:45 crc kubenswrapper[4722]: W0219 19:20:45.020424 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod809781cd_b87f_423a_957c_0d20e074306e.slice/crio-20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb WatchSource:0}: Error finding container 20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb: Status 404 returned error can't find the container with id 20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.021562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.021709 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.521681103 +0000 UTC m=+145.134031427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.022521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.022865 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.5228509 +0000 UTC m=+145.135201224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.035025 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.048499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerStarted","Data":"3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.048810 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.052405 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4gbkr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.052458 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.053021 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" event={"ID":"2d21a014-83a9-43d9-9cdd-5e0897757c90","Type":"ContainerStarted","Data":"4cfcd0939748cd1cb3b3ea5c2d67954b39a00eee17635f5db99d67b1fe3bc5db"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.053806 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.055100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" event={"ID":"51679292-9818-418a-98d6-c442dc7d28e2","Type":"ContainerStarted","Data":"e69bccb77c203b1feff23c4b9aad2d72dd2a1b7a82bb0d0989b26020a120dfe4"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.057246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" event={"ID":"7c9da917-db10-4eba-bdff-f68354e8d4a6","Type":"ContainerStarted","Data":"1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.059299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" event={"ID":"669833fe-83b5-4d4a-a78c-c360789f754b","Type":"ContainerStarted","Data":"ebd746d1c09688e84817367f69c217e579f80a9e47cf50d0256218caf1de358d"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.062258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nzgmv" event={"ID":"3071e162-d262-4732-81ca-10bb9b507321","Type":"ContainerStarted","Data":"00dbbdf46cd7472f445a575505944f76f6c027a639dda8c496d34165cf21eec9"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.079547 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.079579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerStarted","Data":"025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.086735 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ndzb8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.086798 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.099655 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" event={"ID":"38df6625-e726-49e8-9bff-561442dcea53","Type":"ContainerStarted","Data":"d57bee2e79a376f24991d94b9a7298bc56e0eec9133577bf5557eee2bfa5f917"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.102067 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vcmxn"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.116859 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.124972 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.125996 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.625980284 +0000 UTC m=+145.238330608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.164690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" event={"ID":"01bb1078-2d76-42f4-919f-3d1b73a61fd4","Type":"ContainerStarted","Data":"248c060a2310d665aa833656479a64a87327d7b2dd4e362af6c1c55dfa6c5ecd"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.165024 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nm78h"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.166950 4722 generic.go:334] "Generic (PLEG): container finished" podID="47339628-7112-4f7a-b949-fef983428ebe" containerID="efbbd4ce089c3aed8522c44ea57e3b1991a8206547e6f741cb11181d7ef0e7b0" exitCode=0 Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.167185 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" event={"ID":"47339628-7112-4f7a-b949-fef983428ebe","Type":"ContainerDied","Data":"efbbd4ce089c3aed8522c44ea57e3b1991a8206547e6f741cb11181d7ef0e7b0"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.178036 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8ppnm"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.181736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" event={"ID":"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f","Type":"ContainerStarted","Data":"c79963fb50d6dec9455c971ee05aceffe009984dbf652eb69dfcd58ebd97ad44"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.207721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerStarted","Data":"fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.208858 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.213287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" event={"ID":"809781cd-b87f-423a-957c-0d20e074306e","Type":"ContainerStarted","Data":"20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.224051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lg2rd" event={"ID":"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8","Type":"ContainerStarted","Data":"b249b0514a8cadd113dd409bbe53a0666baff045911e6906f6de48eee32345aa"} Feb 19 
19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.224400 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.226984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.227804 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.727788436 +0000 UTC m=+145.340138760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.230220 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.230264 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.235549 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" event={"ID":"f00e2406-a55b-4e28-bed9-a060b0780301","Type":"ContainerStarted","Data":"5425dc9b7a416e558af8638bac8fc3e5f13a0614b6c23eac81fa13b38629a876"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.235842 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.237992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" event={"ID":"a159fd13-de0a-46d3-971f-fb7c2fc652bd","Type":"ContainerStarted","Data":"1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.245874 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-h4zk8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.245907 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" podUID="f00e2406-a55b-4e28-bed9-a060b0780301" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.246929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" 
event={"ID":"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7","Type":"ContainerStarted","Data":"5a94da18c8904d8a56c548e84229a34149eefe8b670a2f7f780f7da2003a71f8"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.254068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerStarted","Data":"c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.255428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.258524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" event={"ID":"41fade82-0d8d-41b2-805e-8a92ffa97cf3","Type":"ContainerStarted","Data":"35b587c3f20b8140ab0f86bba90ba106fa4be4654fc0e85a4c915fa3bc9aa2c1"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.262118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.267026 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" event={"ID":"f5ac7e96-c772-449a-9e9d-d7dabfc6974e","Type":"ContainerStarted","Data":"88829366351bf7263601b984ad15bcb303806ca9ba3d2c5c1f86149792538e76"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.268242 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hj8tk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.268273 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.269677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerStarted","Data":"262b347f2b9a906cc2a369ed3ff2e9b2acf60ad338b20154c8999adf62f8801a"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.271236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" event={"ID":"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93","Type":"ContainerStarted","Data":"dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.278639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxzjr" event={"ID":"814d776b-73c6-4354-8195-da5d3ea2d5cb","Type":"ContainerStarted","Data":"26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.284996 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" event={"ID":"2d33e000-0a81-4601-8120-52dacf0b5d6b","Type":"ContainerStarted","Data":"47b1fb0e415251608ebf2cadce332bcf26820d58f70e5318532d73180da811f8"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.289678 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" event={"ID":"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8","Type":"ContainerStarted","Data":"f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.292123 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.292232 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.328491 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.329912 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.829894068 +0000 UTC m=+145.442244402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.347857 4722 csr.go:261] certificate signing request csr-kbqk4 is approved, waiting to be issued Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.353221 4722 csr.go:257] certificate signing request csr-kbqk4 is issued Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.405772 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" podStartSLOduration=125.405754314 podStartE2EDuration="2m5.405754314s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.403872263 +0000 UTC m=+145.016222597" watchObservedRunningTime="2026-02-19 19:20:45.405754314 +0000 UTC m=+145.018104638" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.432924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.435548 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.935536072 +0000 UTC m=+145.547886396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.485223 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.488860 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.488907 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.533741 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 
19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.533968 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.033946616 +0000 UTC m=+145.646296950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.534228 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.534719 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.03469825 +0000 UTC m=+145.647048574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.635032 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.635583 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.135567742 +0000 UTC m=+145.747918066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.635725 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.636050 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.136042767 +0000 UTC m=+145.748393091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.711995 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" podStartSLOduration=124.711975815 podStartE2EDuration="2m4.711975815s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.707613226 +0000 UTC m=+145.319963560" watchObservedRunningTime="2026-02-19 19:20:45.711975815 +0000 UTC m=+145.324326139" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.736839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.737191 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.237176467 +0000 UTC m=+145.849526791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.761344 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lg2rd" podStartSLOduration=124.761328467 podStartE2EDuration="2m4.761328467s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.759059725 +0000 UTC m=+145.371410049" watchObservedRunningTime="2026-02-19 19:20:45.761328467 +0000 UTC m=+145.373678781" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.841293 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" podStartSLOduration=124.841278203 podStartE2EDuration="2m4.841278203s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.839447435 +0000 UTC m=+145.451797759" watchObservedRunningTime="2026-02-19 19:20:45.841278203 +0000 UTC m=+145.453628527" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.842546 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" podStartSLOduration=124.842539843 podStartE2EDuration="2m4.842539843s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.787531611 +0000 UTC m=+145.399881935" watchObservedRunningTime="2026-02-19 19:20:45.842539843 +0000 UTC m=+145.454890167" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.845568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.846008 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.345995493 +0000 UTC m=+145.958345817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.875765 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podStartSLOduration=125.8757464 podStartE2EDuration="2m5.8757464s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.874619324 +0000 UTC m=+145.486969658" watchObservedRunningTime="2026-02-19 19:20:45.8757464 +0000 UTC m=+145.488096734" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.904887 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podStartSLOduration=124.904874347 podStartE2EDuration="2m4.904874347s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.904251218 +0000 UTC m=+145.516601532" watchObservedRunningTime="2026-02-19 19:20:45.904874347 +0000 UTC m=+145.517224671" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.943394 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" podStartSLOduration=125.943375884 podStartE2EDuration="2m5.943375884s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.943139057 +0000 UTC m=+145.555489381" watchObservedRunningTime="2026-02-19 19:20:45.943375884 +0000 UTC m=+145.555726208" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.946121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.946501 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.446474203 +0000 UTC m=+146.058824527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.024249 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" podStartSLOduration=125.024230659 podStartE2EDuration="2m5.024230659s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.020783269 +0000 UTC m=+145.633133593" watchObservedRunningTime="2026-02-19 19:20:46.024230659 +0000 UTC m=+145.636580983" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.049337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.049671 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.549660229 +0000 UTC m=+146.162010553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.065614 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" podStartSLOduration=125.065601086 podStartE2EDuration="2m5.065601086s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.064614614 +0000 UTC m=+145.676964948" watchObservedRunningTime="2026-02-19 19:20:46.065601086 +0000 UTC m=+145.677951410" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.102945 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hxzjr" podStartSLOduration=6.102929365 podStartE2EDuration="6.102929365s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.102286324 +0000 UTC m=+145.714636648" watchObservedRunningTime="2026-02-19 19:20:46.102929365 +0000 UTC m=+145.715279689" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.145002 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podStartSLOduration=125.144979044 podStartE2EDuration="2m5.144979044s" 
podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.141250215 +0000 UTC m=+145.753600549" watchObservedRunningTime="2026-02-19 19:20:46.144979044 +0000 UTC m=+145.757329368" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.158228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.158830 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.658807965 +0000 UTC m=+146.271158289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.186541 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" podStartSLOduration=125.186518477 podStartE2EDuration="2m5.186518477s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.1837741 +0000 UTC m=+145.796124424" watchObservedRunningTime="2026-02-19 19:20:46.186518477 +0000 UTC m=+145.798868801" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.234444 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nzgmv" podStartSLOduration=125.234425733 podStartE2EDuration="2m5.234425733s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.233606406 +0000 UTC m=+145.845956730" watchObservedRunningTime="2026-02-19 19:20:46.234425733 +0000 UTC m=+145.846776057" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.259734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: 
\"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.260090 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.76007872 +0000 UTC m=+146.372429044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.324121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" event={"ID":"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7","Type":"ContainerStarted","Data":"ef351aef5ad10b0d00005e8a1bd3c37aed6f5d4190aea65379d8fb9ecb740f99"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.324517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" event={"ID":"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7","Type":"ContainerStarted","Data":"786d0e815d19e5cfd8c18ce0bba924e838645f8b28cafacfc53ea2145457700b"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.327104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxzjr" event={"ID":"814d776b-73c6-4354-8195-da5d3ea2d5cb","Type":"ContainerStarted","Data":"3c1291b5cc9c8bc13b8c696b72a60f61ccde754e12194e77e57c3708607a8443"} Feb 19 19:20:46 crc 
kubenswrapper[4722]: I0219 19:20:46.331223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" event={"ID":"2d33e000-0a81-4601-8120-52dacf0b5d6b","Type":"ContainerStarted","Data":"5501c3eb081dc14e56cf70a5f37ecc5b1996c3befffc69aff6179730a9b0b0e7"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.331261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" event={"ID":"2d33e000-0a81-4601-8120-52dacf0b5d6b","Type":"ContainerStarted","Data":"f03d0121879c803e8e680997d52d61aa79e2f6261d842e87232ef1598453a3ca"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.349134 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nm78h" event={"ID":"71548ff6-f831-48ba-af51-99fe431c447a","Type":"ContainerStarted","Data":"a6afc5bb0151fb3589a5f1b4d48148be1a0aca54a76983d626de4faa6cde1adf"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.349206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nm78h" event={"ID":"71548ff6-f831-48ba-af51-99fe431c447a","Type":"ContainerStarted","Data":"a278268985d8271886d90a06c39fe28e63a5048150219f6a164b404e21d49a34"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.350773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" event={"ID":"e040b47b-3688-40e2-a410-0dfa43ad8ef3","Type":"ContainerStarted","Data":"a1a12887074bc7205f6d66a879c77c0ef4f7f3a8bf0793e8b00c566ebf76f769"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.350805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" event={"ID":"e040b47b-3688-40e2-a410-0dfa43ad8ef3","Type":"ContainerStarted","Data":"7ab0721f9382bb0a61d42969ec2a0016a1de1ce2798238d6001286a67f8fe122"} Feb 19 
19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.350814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" event={"ID":"e040b47b-3688-40e2-a410-0dfa43ad8ef3","Type":"ContainerStarted","Data":"77432a2b3182197a106526e3411ba4b9afdc0787a579a0b07be58df90225d826"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.351499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.352489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" event={"ID":"03c35cd8-4a1b-4847-a5f2-0fe0e884d191","Type":"ContainerStarted","Data":"404220fba85b8f4cd94ce26daab4420694362116ee06580bf0d58f15925b8851"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.352514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" event={"ID":"03c35cd8-4a1b-4847-a5f2-0fe0e884d191","Type":"ContainerStarted","Data":"95f7af19afa99ff5b31abc67f240a02dc2cf9faeb66190d33c89d73e50c1a4e4"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.355214 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 19:15:45 +0000 UTC, rotation deadline is 2026-12-24 08:05:44.800030569 +0000 UTC Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.355245 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7380h44m58.444787319s for next certificate rotation Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.360737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.361099 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.861086116 +0000 UTC m=+146.473436440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.376904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" event={"ID":"01bb1078-2d76-42f4-919f-3d1b73a61fd4","Type":"ContainerStarted","Data":"ed817a0aa20bb1d426865f86a0c95ad06b1484dc026bcf82224806954734cd7f"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.389795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" event={"ID":"f5ac7e96-c772-449a-9e9d-d7dabfc6974e","Type":"ContainerStarted","Data":"a18c29ec58d9c9ca68cc394cf953a870b83d5976024ff4f95bb88828c583b002"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.389839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" event={"ID":"f5ac7e96-c772-449a-9e9d-d7dabfc6974e","Type":"ContainerStarted","Data":"25bec7bbf6562e00857247154a6af48291dca8d73706c0f322cd88f5d9d09f1a"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.408013 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerStarted","Data":"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.427600 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" podStartSLOduration=125.427584434 podStartE2EDuration="2m5.427584434s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.422026347 +0000 UTC m=+146.034376671" watchObservedRunningTime="2026-02-19 19:20:46.427584434 +0000 UTC m=+146.039934758" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.432441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" event={"ID":"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b","Type":"ContainerStarted","Data":"f16364e37b28f1c4da2487c503fce09e28b2dbcdc002629b73d6491128b47ad4"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.433627 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.442422 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cjtjp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.442477 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" 
podUID="e8ae2d71-7578-4343-a1ba-5d414cd1cc4b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.446304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" event={"ID":"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8","Type":"ContainerStarted","Data":"8425b86e33f55bb836d35d4634244ca0cb341b776d588b53581c9e30ed8a79f9"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.448203 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" event={"ID":"2d21a014-83a9-43d9-9cdd-5e0897757c90","Type":"ContainerStarted","Data":"3eaf5e0d17a916f529d58c56a948245ee445ce0abc5547632c8daa87ac6ef597"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.449454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vcmxn" event={"ID":"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80","Type":"ContainerStarted","Data":"14c6449b96d273476e5619cfa1df06a0ddf8ad4b241c5b2988dd87c905f18a5b"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.449480 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vcmxn" event={"ID":"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80","Type":"ContainerStarted","Data":"ca5e2d4d2c79c35beb41d1f1533d07d91ac580bc8fddf03fd37c30b0b401bacb"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.450831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" event={"ID":"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93","Type":"ContainerStarted","Data":"ecab273ca6f09e83f8d938ca5a5c7b07e951b67b29f42d5c366966f39555be18"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.450887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" event={"ID":"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93","Type":"ContainerStarted","Data":"ca577788ccc8003f8c58ecd6fd8b1b2af1187a4ea0c3dc28431fa6208739fa3f"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.462231 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.463381 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.963366933 +0000 UTC m=+146.575717257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.468911 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" podStartSLOduration=125.468896699 podStartE2EDuration="2m5.468896699s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.467393662 +0000 UTC m=+146.079743986" watchObservedRunningTime="2026-02-19 19:20:46.468896699 +0000 UTC m=+146.081247023" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.468965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerStarted","Data":"8f83e76bc690bdd57bc344bba92f7c88169120f7b1c51ac629ed1570364c73e1"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.469008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerStarted","Data":"ad52433eac2761f6de6f0241da97dc4626ecf152358f037cf4dab8f8735ee9bc"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.470338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" 
event={"ID":"a159fd13-de0a-46d3-971f-fb7c2fc652bd","Type":"ContainerStarted","Data":"eec25437ba8663aa5716ec06c5513a50e649322d003af3053d48042e55a26585"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.483546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" event={"ID":"6a5359b9-b29a-4c86-8dc8-f00b659cecb0","Type":"ContainerStarted","Data":"e4ec9499ce1bafa92ddfa04e1301528389d974ea0247bb6448d9b10dff8fad90"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.483593 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" event={"ID":"6a5359b9-b29a-4c86-8dc8-f00b659cecb0","Type":"ContainerStarted","Data":"f9f897c30862be4ab75fbe152f33b6d3f00620127a0529adf844cce1b1651c26"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.484197 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.485243 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hwl66 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.485295 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" podUID="6a5359b9-b29a-4c86-8dc8-f00b659cecb0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.485624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" event={"ID":"669833fe-83b5-4d4a-a78c-c360789f754b","Type":"ContainerStarted","Data":"527d984855f487908220402ab59141e5bdeb122d89fb13cdd73d16244ee7006d"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.489365 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:46 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:46 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:46 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.489428 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.497473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" event={"ID":"b3ea57e2-def2-4a73-a86b-75be99e36e46","Type":"ContainerStarted","Data":"fa6a7a9c8534399f7df0974921698b483c39f23b873ed23dfb9c08f6107c7e10"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.497521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" event={"ID":"b3ea57e2-def2-4a73-a86b-75be99e36e46","Type":"ContainerStarted","Data":"3119525aa03a6ddfce98df5b848e0a0d1c9d02178595305e6d64992ed8ef5567"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.520642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" event={"ID":"47339628-7112-4f7a-b949-fef983428ebe","Type":"ContainerStarted","Data":"b6d79205a0222bfdb6247fb125a326c0b7cf14f6f3d22ce6eda0bdeb3bb4b4fe"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.521224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.521923 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" podStartSLOduration=125.521908527 podStartE2EDuration="2m5.521908527s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.520495843 +0000 UTC m=+146.132846167" watchObservedRunningTime="2026-02-19 19:20:46.521908527 +0000 UTC m=+146.134258851" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.537577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" event={"ID":"809781cd-b87f-423a-957c-0d20e074306e","Type":"ContainerStarted","Data":"c136cc52ee912bf0f61c2dc6896709fa6e408d5cb6acf70b2dcbdcdda1cb3f18"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.549309 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerStarted","Data":"2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.560367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" 
event={"ID":"f25b1c29-b400-4bd5-8e63-ac31629a0aa2","Type":"ContainerStarted","Data":"b691163b677c910509674f6db5d9ff696e93d9b243cff4172a82c0c83912a8be"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.560409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" event={"ID":"f25b1c29-b400-4bd5-8e63-ac31629a0aa2","Type":"ContainerStarted","Data":"6c6b6c9401ac9dbd4343886936faef598dc029884f065d81df8c7afe5961220e"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.562701 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" event={"ID":"5693eddb-45a4-4cee-acb8-d3c0f23d16b8","Type":"ContainerStarted","Data":"c89ce2b280b428c740d5bd22287118dd56d4145a7504015a6bb3d6f0e5e982be"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.565371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.566956 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.066941771 +0000 UTC m=+146.679292095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.572582 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" podStartSLOduration=125.572566211 podStartE2EDuration="2m5.572566211s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.571431254 +0000 UTC m=+146.183781578" watchObservedRunningTime="2026-02-19 19:20:46.572566211 +0000 UTC m=+146.184916535" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.577707 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" event={"ID":"7c9da917-db10-4eba-bdff-f68354e8d4a6","Type":"ContainerStarted","Data":"caed6afe54d834d8d6a61929597731be14994252ce2dbaa2f9ca830772213232"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.577747 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.582632 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-h4zk8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: 
I0219 19:20:46.582663 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4gbkr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.582682 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" podUID="f00e2406-a55b-4e28-bed9-a060b0780301" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.582715 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.583791 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.583811 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.605390 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-klvwp container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.605436 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" podUID="7c9da917-db10-4eba-bdff-f68354e8d4a6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.625595 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" podStartSLOduration=125.625578189 podStartE2EDuration="2m5.625578189s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.622697627 +0000 UTC m=+146.235047951" watchObservedRunningTime="2026-02-19 19:20:46.625578189 +0000 UTC m=+146.237928513" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.672173 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.673781 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:47.173765203 +0000 UTC m=+146.786115527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.682717 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-txlzt" podStartSLOduration=125.682703158 podStartE2EDuration="2m5.682703158s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.663524597 +0000 UTC m=+146.275874921" watchObservedRunningTime="2026-02-19 19:20:46.682703158 +0000 UTC m=+146.295053482" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.731202 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" podStartSLOduration=125.731180011 podStartE2EDuration="2m5.731180011s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.730852951 +0000 UTC m=+146.343203275" watchObservedRunningTime="2026-02-19 19:20:46.731180011 +0000 UTC m=+146.343530335" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.733335 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" podStartSLOduration=125.7333277 
podStartE2EDuration="2m5.7333277s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.683858385 +0000 UTC m=+146.296208709" watchObservedRunningTime="2026-02-19 19:20:46.7333277 +0000 UTC m=+146.345678024" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.756702 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" podStartSLOduration=125.756683564 podStartE2EDuration="2m5.756683564s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.755400303 +0000 UTC m=+146.367750627" watchObservedRunningTime="2026-02-19 19:20:46.756683564 +0000 UTC m=+146.369033878" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.775636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.775848 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.275832083 +0000 UTC m=+146.888182407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.775933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.776230 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.276224446 +0000 UTC m=+146.888574770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.781502 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" podStartSLOduration=125.781486743 podStartE2EDuration="2m5.781486743s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.772488038 +0000 UTC m=+146.384838372" watchObservedRunningTime="2026-02-19 19:20:46.781486743 +0000 UTC m=+146.393837067" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.842293 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" podStartSLOduration=125.84227218 podStartE2EDuration="2m5.84227218s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.808312668 +0000 UTC m=+146.420662992" watchObservedRunningTime="2026-02-19 19:20:46.84227218 +0000 UTC m=+146.454622504" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.843837 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" podStartSLOduration=125.843829599 podStartE2EDuration="2m5.843829599s" 
podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.841259867 +0000 UTC m=+146.453610191" watchObservedRunningTime="2026-02-19 19:20:46.843829599 +0000 UTC m=+146.456179923" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.884523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.885003 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.384987239 +0000 UTC m=+146.997337563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.925675 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" podStartSLOduration=125.925655975 podStartE2EDuration="2m5.925655975s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.892412746 +0000 UTC m=+146.504763060" watchObservedRunningTime="2026-02-19 19:20:46.925655975 +0000 UTC m=+146.538006299" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.927942 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vcmxn" podStartSLOduration=6.927934447 podStartE2EDuration="6.927934447s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.925527511 +0000 UTC m=+146.537877835" watchObservedRunningTime="2026-02-19 19:20:46.927934447 +0000 UTC m=+146.540284761" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.951033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" podStartSLOduration=125.951017763 podStartE2EDuration="2m5.951017763s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.949631138 +0000 UTC m=+146.561981462" watchObservedRunningTime="2026-02-19 19:20:46.951017763 +0000 UTC m=+146.563368087" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.986531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.986887 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.486875354 +0000 UTC m=+147.099225678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.011572 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" podStartSLOduration=126.0115507 podStartE2EDuration="2m6.0115507s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.010511388 +0000 UTC m=+146.622861712" watchObservedRunningTime="2026-02-19 19:20:47.0115507 +0000 UTC m=+146.623901024" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.069344 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" podStartSLOduration=126.069330191 podStartE2EDuration="2m6.069330191s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.05486979 +0000 UTC m=+146.667220114" watchObservedRunningTime="2026-02-19 19:20:47.069330191 +0000 UTC m=+146.681680515" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.087540 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.087860 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.58784613 +0000 UTC m=+147.200196454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.120673 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" podStartSLOduration=126.120659624 podStartE2EDuration="2m6.120659624s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.119658393 +0000 UTC m=+146.732008717" watchObservedRunningTime="2026-02-19 19:20:47.120659624 +0000 UTC m=+146.733009948" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.152053 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" podStartSLOduration=126.152037084 podStartE2EDuration="2m6.152037084s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
19:20:47.148668826 +0000 UTC m=+146.761019150" watchObservedRunningTime="2026-02-19 19:20:47.152037084 +0000 UTC m=+146.764387408" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.188591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.188949 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.688939159 +0000 UTC m=+147.301289483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.246672 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" podStartSLOduration=127.246632976 podStartE2EDuration="2m7.246632976s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.206920431 +0000 UTC m=+146.819270755" watchObservedRunningTime="2026-02-19 19:20:47.246632976 +0000 UTC 
m=+146.858983300" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.277528 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" podStartSLOduration=127.27751391 podStartE2EDuration="2m7.27751391s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.276631921 +0000 UTC m=+146.888982235" watchObservedRunningTime="2026-02-19 19:20:47.27751391 +0000 UTC m=+146.889864234" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.278161 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" podStartSLOduration=127.2781431 podStartE2EDuration="2m7.2781431s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.248751983 +0000 UTC m=+146.861102307" watchObservedRunningTime="2026-02-19 19:20:47.2781431 +0000 UTC m=+146.890493424" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.289346 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.289671 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.789656677 +0000 UTC m=+147.402007001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.363890 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.390536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.390869 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.890858609 +0000 UTC m=+147.503208933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.486650 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:47 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:47 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:47 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.486726 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.491355 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.491713 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:47.991698271 +0000 UTC m=+147.604048595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.579779 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ndzb8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.579835 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.582428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"60dcf0b9a4be5ab5a34b9f2bd3abcff72a4518eb236ec8de365ec62dad633e02"} Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.585025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nm78h" 
event={"ID":"71548ff6-f831-48ba-af51-99fe431c447a","Type":"ContainerStarted","Data":"46422a26465d6e26c663457d9b147ecab7b8595a4e1a3b38e7524741f6b348d9"} Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.592215 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.592365 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.592415 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.593661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.593986 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.093974677 +0000 UTC m=+147.706325001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.596410 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.625839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.625887 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.630001 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nm78h" podStartSLOduration=7.629989344 podStartE2EDuration="7.629989344s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.629810478 +0000 UTC m=+147.242160812" watchObservedRunningTime="2026-02-19 19:20:47.629989344 +0000 UTC m=+147.242339668" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.657225 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.694091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.694296 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.19423801 +0000 UTC m=+147.806588334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.695314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.698250 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.198232898 +0000 UTC m=+147.810583262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.797382 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.797732 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.297716025 +0000 UTC m=+147.910066349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.824239 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.824499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.835264 4722 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bg6mf container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.27:8443/livez\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.835311 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" podUID="8c255c5e-d6d9-4772-9151-0065df6dc00d" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.27:8443/livez\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.900060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" 
Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.900415 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.400400736 +0000 UTC m=+148.012751060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.000735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.000916 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.500888656 +0000 UTC m=+148.113238990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.001100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.001473 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.501463064 +0000 UTC m=+148.113813458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.055418 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.102359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.102566 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.602533392 +0000 UTC m=+148.214883726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.102901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.103322 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.603311327 +0000 UTC m=+148.215661651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.204509 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.204714 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.704683966 +0000 UTC m=+148.317034300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.204867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.205194 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.705182412 +0000 UTC m=+148.317532736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.305917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.306113 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.806086285 +0000 UTC m=+148.418436609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.306178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.306469 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.806457417 +0000 UTC m=+148.418807741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407938 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407960 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.408112 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.908088373 +0000 UTC m=+148.520438697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.409008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.415264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.418765 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.426779 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.492374 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:48 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:48 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:48 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.492429 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.508919 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.509313 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.009296826 +0000 UTC m=+148.621647150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.589365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"1c40fc8578525e5a466a0589c438b55ae6944e80d0dfa7a83da7efbd3e8cc78c"} Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.591960 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-klvwp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.592006 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" podUID="7c9da917-db10-4eba-bdff-f68354e8d4a6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.594801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.606859 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.609757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.610022 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.110007723 +0000 UTC m=+148.722358037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.626756 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.634239 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.711364 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.711793 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.211778244 +0000 UTC m=+148.824128568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.812651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.812790 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.31277216 +0000 UTC m=+148.925122484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.812914 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.813223 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.313215875 +0000 UTC m=+148.925566189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.850705 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.915367 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.915630 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.415615335 +0000 UTC m=+149.027965659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.016843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.017229 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.517212331 +0000 UTC m=+149.129562655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.119096 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.119440 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.619425945 +0000 UTC m=+149.231776269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.221583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.222275 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.722262431 +0000 UTC m=+149.334612755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.319780 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-64frs"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.320937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.323790 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.325132 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.825088285 +0000 UTC m=+149.437438609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.325306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.325633 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.825621832 +0000 UTC m=+149.437972156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.326193 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.336087 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64frs"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " 
pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.429515 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.929499959 +0000 UTC m=+149.541850283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.488454 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:49 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:49 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:49 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.488504 4722 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.521773 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.523412 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.523891 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.529093 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534086 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534142 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.535125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.535365 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.035354471 +0000 UTC m=+149.647704795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.547463 4722 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.568204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.610629 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2bb4965fdf013ff81485111c60e1de5bff0cb3ec10055f3efa2e334f0dc3ab98"} Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.610674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"aaca4df240278c17008fc36a63273475dea5aa9226711311ac6aa6f2839afb8a"} Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.613931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ce36a6ef0ea013be2fca11d6b0e284251ec8550e4f4352ef961ee5bf851c6d00"} Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.613962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"58d302fbb4956b3b899f9b13decd7899140aa1d4f3be3e194d3745a01343d0c1"} Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.614327 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.619352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1415ed088f539e032c87a251625d406548f68ab5172a3fa9d829a6a5ae0f184c"} Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.619408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8114d3a85535aeaa72f4582e9630b8967c980d11a0f614ba1c90371f598749ce"} Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636319 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbkr\" (UniqueName: 
\"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"9943299ced272b97fa61876ac6166d28f9833a1d6f5199b897797517c94c4426"} Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636812 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"4509d1d5ead6056c1f093d934c41d3443f253286e3693d084ef23edbe7ddc5d0"} Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.636875 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:50.136858873 +0000 UTC m=+149.749209197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.666396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.700065 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" podStartSLOduration=9.700045476 podStartE2EDuration="9.700045476s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:49.697269097 +0000 UTC m=+149.309619421" watchObservedRunningTime="2026-02-19 19:20:49.700045476 +0000 UTC m=+149.312395800" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.725337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.726362 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.738722 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:50.238707887 +0000 UTC m=+149.851058211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.740932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.741252 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.742310 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.800776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.840611 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.841971 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.842115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.842300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.842363 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.342322506 +0000 UTC m=+149.954672830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.858401 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.934605 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.935586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.949499 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.449482429 +0000 UTC m=+150.061832753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.951090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.952110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.989675 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.006904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.055770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.055993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.056045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.056091 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: E0219 19:20:50.056201 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.556186067 +0000 UTC m=+150.168536391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.060399 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.102466 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64frs"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.158421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.158681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: E0219 19:20:50.160000 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.659984493 +0000 UTC m=+150.272334817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.182910 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.245225 4722 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T19:20:49.547484837Z","Handler":null,"Name":""} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.255668 4722 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.255705 4722 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.258640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.282327 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.293447 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.334458 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.366011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.376931 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.376974 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.424262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.487213 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.492276 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:50 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:50 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:50 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.492319 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.606170 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.643144 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3632-a132-4377-95ef-564cffb1f299" containerID="83c9ec76be9f3502d89c676d78e714eeea9b0340976175aeadfd0dc3726f4500" exitCode=0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.643253 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"83c9ec76be9f3502d89c676d78e714eeea9b0340976175aeadfd0dc3726f4500"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.643283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerStarted","Data":"d33d192f020b6508198a4a19887938ad42d94be353afef74a8413b4aa30e91d1"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644601 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644702 4722 generic.go:334] "Generic (PLEG): container finished" podID="c594681e-de0b-4b39-98d3-573c9170c898" containerID="83d63174a5dee0510e001a33beae280a6c56b7d09645762d8197fc6948f07c46" exitCode=0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"83d63174a5dee0510e001a33beae280a6c56b7d09645762d8197fc6948f07c46"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644843 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerStarted","Data":"9b378dd4da61b5af99f5f93bba7c15d0d04355aa249d4e89b10b4d368ec3db4e"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.648129 4722 generic.go:334] "Generic (PLEG): container finished" podID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" exitCode=0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.648345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.648384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerStarted","Data":"8c40a4539d5d6930a5a906cb44965a1810a1f2192dbfb01db14eeaf97f5cc6ee"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.676558 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.912224 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.948756 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.949385 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.951528 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.951645 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.961461 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.079598 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.087915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.088030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.188549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.188654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.188717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.214203 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.262686 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.477552 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.486333 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:51 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:51 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:51 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.486387 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.503591 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.504558 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.512568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.519257 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.595360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.595838 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.595872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.660593 4722 generic.go:334] "Generic (PLEG): container finished" podID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerID="2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b" exitCode=0 Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 
19:20:51.660653 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerDied","Data":"2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.665447 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerID="d8152997987bda50dd12277fbfbc9da38a131bf85945cd167cb7db72d9b9372b" exitCode=0 Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.665552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"d8152997987bda50dd12277fbfbc9da38a131bf85945cd167cb7db72d9b9372b"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.665585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerStarted","Data":"ff5ad27012e651ea99b2c5454cf7b789a1c44ed2c936a800e67aa01d7e7683b4"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.667517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"595d34f9-545d-47de-9a83-bd6210f4fe5e","Type":"ContainerStarted","Data":"1d9b1f4a4cce5c7d90fbde391391db7e28d94c0e18c38a34002436f351a36014"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.670434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerStarted","Data":"b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.670465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerStarted","Data":"7fc589c7d609f9f8ea97795796aadbb293f365ba97d8b385ba4c6ea2f33eb413"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.670926 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.696589 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.696642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.696711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.697383 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc 
kubenswrapper[4722]: I0219 19:20:51.699476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.718110 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" podStartSLOduration=130.718015948 podStartE2EDuration="2m10.718015948s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:51.715088985 +0000 UTC m=+151.327439309" watchObservedRunningTime="2026-02-19 19:20:51.718015948 +0000 UTC m=+151.330366272" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.734118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.835180 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.901624 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.902897 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.912877 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.000781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.001069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.001099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.102726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.102833 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.102861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.103241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.103357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.122658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.193706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.230693 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.436330 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:20:52 crc kubenswrapper[4722]: W0219 19:20:52.444175 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad9ab6b_efbe_4d01_97b0_281ee8a199df.slice/crio-8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d WatchSource:0}: Error finding container 8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d: Status 404 returned error can't find the container with id 8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.486027 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:52 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:52 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:52 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.486089 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.502057 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:20:52 crc kubenswrapper[4722]: 
I0219 19:20:52.503586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.506424 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.518521 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.610277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.610352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.610393 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.681072 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 
19:20:52.690658 4722 generic.go:334] "Generic (PLEG): container finished" podID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerID="0dd65a739e9f5e8ad490009cf2eebc6f6859f0fe25f4e418d1b7a49467014a17" exitCode=0 Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.690746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"0dd65a739e9f5e8ad490009cf2eebc6f6859f0fe25f4e418d1b7a49467014a17"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.690777 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerStarted","Data":"8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.692948 4722 generic.go:334] "Generic (PLEG): container finished" podID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerID="b18b66df00aa2ddebb51af6a1a5323f2f0daccf9de4d9b58aaa55e91465e07a5" exitCode=0 Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.693036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"595d34f9-545d-47de-9a83-bd6210f4fe5e","Type":"ContainerDied","Data":"b18b66df00aa2ddebb51af6a1a5323f2f0daccf9de4d9b58aaa55e91465e07a5"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.697440 4722 generic.go:334] "Generic (PLEG): container finished" podID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerID="b5c97b5b76e7afa24f8f93363368d20e4563b18ad7e8eaf0a0672fe76a243f0a" exitCode=0 Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.697493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" 
event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"b5c97b5b76e7afa24f8f93363368d20e4563b18ad7e8eaf0a0672fe76a243f0a"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.697555 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerStarted","Data":"104233a8c5f814fc84e4081cc01af39a90044fcd055492fd733214b7e3b634d4"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.711588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.711644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.711692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.712443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " 
pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.712693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.736842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.828047 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.835657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847058 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847088 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847087 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847837 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847181 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.907162 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.908366 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.921425 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.998576 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.017930 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.017982 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.018449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.018498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.018546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"redhat-operators-4tk99\" 
(UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.124550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"0d5e5981-45e4-4970-bff2-17a6087915e9\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.124844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"0d5e5981-45e4-4970-bff2-17a6087915e9\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.124905 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"0d5e5981-45e4-4970-bff2-17a6087915e9\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.125083 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.125196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc 
kubenswrapper[4722]: I0219 19:20:53.125288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.126630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.128812 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "0d5e5981-45e4-4970-bff2-17a6087915e9" (UID: "0d5e5981-45e4-4970-bff2-17a6087915e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.129154 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.140314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz" (OuterVolumeSpecName: "kube-api-access-xgsrz") pod "0d5e5981-45e4-4970-bff2-17a6087915e9" (UID: "0d5e5981-45e4-4970-bff2-17a6087915e9"). InnerVolumeSpecName "kube-api-access-xgsrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.140860 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0d5e5981-45e4-4970-bff2-17a6087915e9" (UID: "0d5e5981-45e4-4970-bff2-17a6087915e9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.146126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.226124 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.226167 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.226193 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.242117 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.246535 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.319330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.475828 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.482936 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.486956 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:53 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:53 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:53 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.487008 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.512068 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:20:53 crc kubenswrapper[4722]: W0219 19:20:53.554768 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12054322_fe1e_4205_b6d3_05b30024a987.slice/crio-8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b WatchSource:0}: Error finding container 8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b: Status 404 returned error can't find the container with id 8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.708261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerStarted","Data":"8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.714510 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerDied","Data":"262b347f2b9a906cc2a369ed3ff2e9b2acf60ad338b20154c8999adf62f8801a"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.714590 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="262b347f2b9a906cc2a369ed3ff2e9b2acf60ad338b20154c8999adf62f8801a" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.714676 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.720894 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerID="78d9b73635fb9fd918479e49197028103f67da7ed33002bbffe05da3a4ec4523" exitCode=0 Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.722350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"78d9b73635fb9fd918479e49197028103f67da7ed33002bbffe05da3a4ec4523"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.722374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerStarted","Data":"1c1bf847d9c8bd6cdac4a8d78654087bcd70cd49df2904b71c207590aa5bdd28"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.762728 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.764010 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.764379 4722 patch_prober.go:28] interesting pod/console-f9d7485db-txlzt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.764415 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-txlzt" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial 
tcp 10.217.0.26:8443: connect: connection refused" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.921457 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.040928 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod \"595d34f9-545d-47de-9a83-bd6210f4fe5e\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.041016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"595d34f9-545d-47de-9a83-bd6210f4fe5e\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.041413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "595d34f9-545d-47de-9a83-bd6210f4fe5e" (UID: "595d34f9-545d-47de-9a83-bd6210f4fe5e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.046539 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "595d34f9-545d-47de-9a83-bd6210f4fe5e" (UID: "595d34f9-545d-47de-9a83-bd6210f4fe5e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.142630 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.142661 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.490454 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:54 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:54 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:54 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.490537 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.730141 4722 generic.go:334] "Generic (PLEG): container finished" podID="12054322-fe1e-4205-b6d3-05b30024a987" containerID="57d551ccacbc04d55c2cac5a3bb7ceb078d63f2d275222bd8c776cbc6fad014d" exitCode=0 Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.730384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"57d551ccacbc04d55c2cac5a3bb7ceb078d63f2d275222bd8c776cbc6fad014d"} Feb 19 
19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.749351 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.752882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"595d34f9-545d-47de-9a83-bd6210f4fe5e","Type":"ContainerDied","Data":"1d9b1f4a4cce5c7d90fbde391391db7e28d94c0e18c38a34002436f351a36014"} Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.752931 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d9b1f4a4cce5c7d90fbde391391db7e28d94c0e18c38a34002436f351a36014" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.871814 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 19:20:54 crc kubenswrapper[4722]: E0219 19:20:54.872404 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerName="pruner" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872424 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerName="pruner" Feb 19 19:20:54 crc kubenswrapper[4722]: E0219 19:20:54.872440 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerName="collect-profiles" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872448 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerName="collect-profiles" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872599 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerName="collect-profiles" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872616 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerName="pruner" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.873055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.874968 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.875158 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.882387 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.957294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.957410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.058973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.059025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.059431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.078299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.250936 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.487206 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:55 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:55 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:55 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.487616 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.601716 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 19:20:55 crc kubenswrapper[4722]: W0219 19:20:55.616268 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod40cd9afa_751d_46e0_b482_2098a89d2840.slice/crio-6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80 WatchSource:0}: Error finding container 6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80: Status 404 returned error can't find the container with id 6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80 Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.759976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerStarted","Data":"6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80"} Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.945535 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:56 crc kubenswrapper[4722]: I0219 19:20:56.487166 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:56 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:56 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:56 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:56 crc kubenswrapper[4722]: I0219 19:20:56.487225 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:56 crc kubenswrapper[4722]: I0219 19:20:56.780298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerStarted","Data":"e80b5b86c31a5449e059431e415e898753f7e3e206ec49bc3aceea682dd84694"} Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.485747 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:57 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:57 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:57 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.485801 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.794612 4722 generic.go:334] "Generic (PLEG): container finished" podID="40cd9afa-751d-46e0-b482-2098a89d2840" containerID="e80b5b86c31a5449e059431e415e898753f7e3e206ec49bc3aceea682dd84694" exitCode=0 Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.794660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerDied","Data":"e80b5b86c31a5449e059431e415e898753f7e3e206ec49bc3aceea682dd84694"} Feb 19 19:20:58 crc kubenswrapper[4722]: I0219 19:20:58.484847 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:58 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:58 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:58 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:58 crc kubenswrapper[4722]: I0219 19:20:58.484920 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:59 crc kubenswrapper[4722]: I0219 19:20:59.485281 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:59 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:59 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:59 crc kubenswrapper[4722]: 
healthz check failed Feb 19 19:20:59 crc kubenswrapper[4722]: I0219 19:20:59.485946 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:21:00 crc kubenswrapper[4722]: I0219 19:21:00.486166 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:21:00 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:21:00 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:21:00 crc kubenswrapper[4722]: healthz check failed Feb 19 19:21:00 crc kubenswrapper[4722]: I0219 19:21:00.486432 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:21:01 crc kubenswrapper[4722]: I0219 19:21:01.486222 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:21:01 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:21:01 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:21:01 crc kubenswrapper[4722]: healthz check failed Feb 19 19:21:01 crc kubenswrapper[4722]: I0219 19:21:01.486302 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 
19 19:21:02 crc kubenswrapper[4722]: I0219 19:21:02.484803 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:21:02 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:21:02 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:21:02 crc kubenswrapper[4722]: healthz check failed Feb 19 19:21:02 crc kubenswrapper[4722]: I0219 19:21:02.485130 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:21:02 crc kubenswrapper[4722]: I0219 19:21:02.849834 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.207680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.215860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.292784 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.500588 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.503216 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.763181 4722 patch_prober.go:28] interesting pod/console-f9d7485db-txlzt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.763235 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-txlzt" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.740700 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.841769 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"40cd9afa-751d-46e0-b482-2098a89d2840\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.841830 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"40cd9afa-751d-46e0-b482-2098a89d2840\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.841966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "40cd9afa-751d-46e0-b482-2098a89d2840" (UID: "40cd9afa-751d-46e0-b482-2098a89d2840"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.842311 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.847274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "40cd9afa-751d-46e0-b482-2098a89d2840" (UID: "40cd9afa-751d-46e0-b482-2098a89d2840"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.943423 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:06 crc kubenswrapper[4722]: I0219 19:21:06.148358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerDied","Data":"6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80"} Feb 19 19:21:06 crc kubenswrapper[4722]: I0219 19:21:06.148407 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80" Feb 19 19:21:06 crc kubenswrapper[4722]: I0219 19:21:06.148438 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.471057 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.471981 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" containerID="cri-o://0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5" gracePeriod=30 Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.480252 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.480540 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" containerID="cri-o://c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34" gracePeriod=30 Feb 19 19:21:10 crc kubenswrapper[4722]: I0219 19:21:10.686956 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:21:11 crc kubenswrapper[4722]: I0219 19:21:11.798525 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:21:11 crc kubenswrapper[4722]: I0219 19:21:11.798595 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:21:12 crc kubenswrapper[4722]: I0219 19:21:12.835752 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hj8tk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 19 19:21:12 crc kubenswrapper[4722]: I0219 19:21:12.836212 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: 
connection refused" Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.675103 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.675190 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.769947 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.776538 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.214455 4722 generic.go:334] "Generic (PLEG): container finished" podID="c1782da0-924a-481b-b0fc-20050e168591" containerID="c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34" exitCode=0 Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.214586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerDied","Data":"c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34"} Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.216405 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="4ded6995-db61-4962-a375-ba80816b8df9" containerID="0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5" exitCode=0 Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.216441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerDied","Data":"0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5"} Feb 19 19:21:22 crc kubenswrapper[4722]: E0219 19:21:22.130768 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 19 19:21:22 crc kubenswrapper[4722]: E0219 19:21:22.131633 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz4g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vqqrf_openshift-marketplace(f10dae1c-d938-4cce-893b-4ad7eca7d23f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:21:22 crc kubenswrapper[4722]: E0219 19:21:22.132855 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" Feb 19 19:21:22 crc 
kubenswrapper[4722]: I0219 19:21:22.836029 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hj8tk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 19 19:21:22 crc kubenswrapper[4722]: I0219 19:21:22.836083 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.518595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.575063 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.575256 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhxq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-j86kw_openshift-marketplace(c594681e-de0b-4b39-98d3-573c9170c898): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.576660 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-j86kw" podUID="c594681e-de0b-4b39-98d3-573c9170c898" Feb 19 19:21:23 crc 
kubenswrapper[4722]: I0219 19:21:23.675696 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 19:21:23 crc kubenswrapper[4722]: I0219 19:21:23.675811 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:21:23 crc kubenswrapper[4722]: I0219 19:21:23.883448 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.879876 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-j86kw" podUID="c594681e-de0b-4b39-98d3-573c9170c898" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.941009 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.970924 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.971218 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vckt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,
ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p4576_openshift-marketplace(f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.973226 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p4576" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975041 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.975396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cd9afa-751d-46e0-b482-2098a89d2840" containerName="pruner" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975411 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cd9afa-751d-46e0-b482-2098a89d2840" containerName="pruner" Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.975428 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975439 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975636 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cd9afa-751d-46e0-b482-2098a89d2840" containerName="pruner" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975680 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.976255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.983026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.033191 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.033335 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nvbkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6tp9x_openshift-marketplace(396bbbdf-7f78-48e7-b02c-0737c221aaa6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034636 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.034716 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6tp9x" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034868 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034953 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.035823 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.036103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca" (OuterVolumeSpecName: "client-ca") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.036161 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config" (OuterVolumeSpecName: "config") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.041993 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.043119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk" (OuterVolumeSpecName: "kube-api-access-lzcqk") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "kube-api-access-lzcqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.072194 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.072306 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-665pv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod redhat-marketplace-hg6kw_openshift-marketplace(7ad9ab6b-efbe-4d01-97b0-281ee8a199df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.074031 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hg6kw" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.105802 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.105923 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9szc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-64frs_openshift-marketplace(0c9d3632-a132-4377-95ef-564cffb1f299): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.107331 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" Feb 19 19:21:25 crc 
kubenswrapper[4722]: I0219 19:21:25.138867 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.138938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139011 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") 
pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139099 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139110 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139119 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139128 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139137 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.140914 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.155328 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s6hhp"] Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.192091 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.192269 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b987n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,App
ArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4tk99_openshift-marketplace(12054322-fe1e-4205-b6d3-05b30024a987): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.193436 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240476 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240597 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 
19:21:25.240633 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc 
kubenswrapper[4722]: I0219 19:21:25.241001 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.241336 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca" (OuterVolumeSpecName: "client-ca") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.241391 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config" (OuterVolumeSpecName: "config") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.242196 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.242641 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.243592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.246044 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.246277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.250847 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr" (OuterVolumeSpecName: "kube-api-access-54srr") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "kube-api-access-54srr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.258257 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.280782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerDied","Data":"91e29c4c51cd956e7890c0dbe940cd28aaff5babb9d72cd9fb735cea262c06b2"} Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.280857 4722 scope.go:117] "RemoveContainer" containerID="c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.281083 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.283056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerDied","Data":"6e1b8dc29249f786b414083b626373283ac9d3f4f6727c121afc4a975d983b31"} Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.283196 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.284928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" event={"ID":"493acad5-7300-4941-9311-19b3d5f21786","Type":"ContainerStarted","Data":"90a81819bec14e2cc6ec1baaf5df2e5daf052397719f77199e24b15492b6f23a"} Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.287534 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerStarted","Data":"a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236"} Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.288793 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6tp9x" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.288967 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-hg6kw" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.289812 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.290128 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p4576" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.290412 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.296541 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.301785 4722 scope.go:117] "RemoveContainer" containerID="0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.340676 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.341949 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.341984 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.342003 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.342023 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.344544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.425994 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.431585 4722 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.530556 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:25 crc kubenswrapper[4722]: W0219 19:21:25.541669 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b15088_b052_4d3f_adca_61ff969d0699.slice/crio-ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf WatchSource:0}: Error finding container ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf: Status 404 returned error can't find the container with id ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.296324 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerStarted","Data":"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.296778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerStarted","Data":"ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.296801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.298048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" 
event={"ID":"493acad5-7300-4941-9311-19b3d5f21786","Type":"ContainerStarted","Data":"dd1d6ce5b730c6775283e4a7a31924f3ded3072999fc007734ab62952de32159"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.298075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" event={"ID":"493acad5-7300-4941-9311-19b3d5f21786","Type":"ContainerStarted","Data":"770592151c0712bb70350497af14f58fe44588b5ccfe02c05ab0268cd96a68f6"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.300340 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerID="a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236" exitCode=0 Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.300394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.304070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.323320 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" podStartSLOduration=17.323300519 podStartE2EDuration="17.323300519s" podCreationTimestamp="2026-02-19 19:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:26.321383778 +0000 UTC m=+185.933734102" watchObservedRunningTime="2026-02-19 19:21:26.323300519 +0000 UTC m=+185.935650843" Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.362218 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-s6hhp" podStartSLOduration=166.362183457 podStartE2EDuration="2m46.362183457s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:26.359599355 +0000 UTC m=+185.971949699" watchObservedRunningTime="2026-02-19 19:21:26.362183457 +0000 UTC m=+185.974533781" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.080345 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ded6995-db61-4962-a375-ba80816b8df9" path="/var/lib/kubelet/pods/4ded6995-db61-4962-a375-ba80816b8df9/volumes" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.081350 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1782da0-924a-481b-b0fc-20050e168591" path="/var/lib/kubelet/pods/c1782da0-924a-481b-b0fc-20050e168591/volumes" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.308818 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerStarted","Data":"46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee"} Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.331810 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rnljk" podStartSLOduration=2.26319013 podStartE2EDuration="35.331793706s" podCreationTimestamp="2026-02-19 19:20:52 +0000 UTC" firstStartedPulling="2026-02-19 19:20:53.724429612 +0000 UTC m=+153.336779936" lastFinishedPulling="2026-02-19 19:21:26.793033188 +0000 UTC m=+186.405383512" observedRunningTime="2026-02-19 19:21:27.328742779 +0000 UTC m=+186.941093113" watchObservedRunningTime="2026-02-19 19:21:27.331793706 +0000 UTC m=+186.944144030" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.541564 4722 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:27 crc kubenswrapper[4722]: E0219 19:21:27.541892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.541915 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.542087 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.542779 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.545582 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.545582 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.547123 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.547450 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.547486 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.549297 4722 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.550873 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.672622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.672681 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.672724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.673023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: 
\"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.774747 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.775160 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.775205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.775233 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.776277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.777030 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.782435 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.795826 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.869746 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:28 crc kubenswrapper[4722]: I0219 19:21:28.343128 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:28 crc kubenswrapper[4722]: I0219 19:21:28.631611 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.325572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerStarted","Data":"d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27"} Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.325651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerStarted","Data":"dbd20b6d66e3f4f6e61fa79b316492ac3959c655c54ad91cf639b4c0480d6e0e"} Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.325833 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.331411 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.342366 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" podStartSLOduration=20.342346302 podStartE2EDuration="20.342346302s" podCreationTimestamp="2026-02-19 19:21:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:29.341315479 +0000 UTC m=+188.953665823" watchObservedRunningTime="2026-02-19 19:21:29.342346302 +0000 UTC m=+188.954696626" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.431302 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.432086 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager" containerID="cri-o://c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89" gracePeriod=30 Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.878082 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.002833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.002914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.002960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.003033 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.003088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.003894 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca" (OuterVolumeSpecName: "client-ca") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.004009 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.004016 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config" (OuterVolumeSpecName: "config") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.008626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.009374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd" (OuterVolumeSpecName: "kube-api-access-l8ldd") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "kube-api-access-l8ldd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104085 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104115 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104124 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104134 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104143 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331516 4722 generic.go:334] "Generic (PLEG): container finished" podID="c0b15088-b052-4d3f-adca-61ff969d0699" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89" exitCode=0 Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331610 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerDied","Data":"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"} Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerDied","Data":"ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf"} Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331692 4722 scope.go:117] "RemoveContainer" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.362270 4722 scope.go:117] "RemoveContainer" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89" Feb 19 19:21:30 crc kubenswrapper[4722]: E0219 19:21:30.363266 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89\": container with ID starting with c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89 not found: ID does not exist" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.363322 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"} err="failed to get container status \"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89\": rpc error: code = NotFound desc = could not find container 
\"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89\": container with ID starting with c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89 not found: ID does not exist" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.367391 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.369850 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.542228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"] Feb 19 19:21:30 crc kubenswrapper[4722]: E0219 19:21:30.543506 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.543585 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.543721 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.545692 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.547839 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.548103 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.548158 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.549492 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.550128 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.550375 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.552393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"] Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.554824 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713250 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " 
pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713397 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.814722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.816907 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.817274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.817970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.819109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.840518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 
19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.898377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.086746 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" path="/var/lib/kubelet/pods/c0b15088-b052-4d3f-adca-61ff969d0699/volumes" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.091277 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"] Feb 19 19:21:31 crc kubenswrapper[4722]: W0219 19:21:31.112375 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47caee59_bfc1_4d8b_89f9_f7e9dc92c22c.slice/crio-eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d WatchSource:0}: Error finding container eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d: Status 404 returned error can't find the container with id eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.336680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerStarted","Data":"7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc"} Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.337022 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.337033 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" 
event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerStarted","Data":"eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d"} Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.341981 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.355389 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" podStartSLOduration=2.355375027 podStartE2EDuration="2.355375027s" podCreationTimestamp="2026-02-19 19:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:31.354254471 +0000 UTC m=+190.966604815" watchObservedRunningTime="2026-02-19 19:21:31.355375027 +0000 UTC m=+190.967725351" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.867733 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.868521 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.870063 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.874917 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.874949 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.031039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.031139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.132152 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.132262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.132276 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.152242 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.189401 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.606081 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 19:21:32 crc kubenswrapper[4722]: W0219 19:21:32.625243 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddf91f980_2d12_4c18_8f5f_91bf1a5b4136.slice/crio-2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88 WatchSource:0}: Error finding container 2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88: Status 404 returned error can't find the container with id 2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88 Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.848054 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.848107 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:21:33 crc kubenswrapper[4722]: I0219 19:21:33.347941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerStarted","Data":"0da74604871e34120ab12154e4c46a70f3a348702b25e1082ee896814de85bc4"} Feb 19 19:21:33 crc kubenswrapper[4722]: I0219 19:21:33.348261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerStarted","Data":"2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88"} Feb 19 19:21:33 crc kubenswrapper[4722]: I0219 19:21:33.998960 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rnljk" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" 
containerName="registry-server" probeResult="failure" output=< Feb 19 19:21:33 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:21:33 crc kubenswrapper[4722]: > Feb 19 19:21:34 crc kubenswrapper[4722]: I0219 19:21:34.354012 4722 generic.go:334] "Generic (PLEG): container finished" podID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerID="0da74604871e34120ab12154e4c46a70f3a348702b25e1082ee896814de85bc4" exitCode=0 Feb 19 19:21:34 crc kubenswrapper[4722]: I0219 19:21:34.354055 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerDied","Data":"0da74604871e34120ab12154e4c46a70f3a348702b25e1082ee896814de85bc4"} Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.652091 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785284 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "df91f980-2d12-4c18-8f5f-91bf1a5b4136" (UID: 
"df91f980-2d12-4c18-8f5f-91bf1a5b4136"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785549 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.790948 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "df91f980-2d12-4c18-8f5f-91bf1a5b4136" (UID: "df91f980-2d12-4c18-8f5f-91bf1a5b4136"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.886844 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.366851 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerDied","Data":"2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88"} Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.366903 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.366970 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.465131 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 19:21:36 crc kubenswrapper[4722]: E0219 19:21:36.465517 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerName="pruner" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.465545 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerName="pruner" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.465736 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerName="pruner" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.466356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.468412 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.471081 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.477231 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.595081 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.595126 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.595219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.696648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.696733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.696763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.697320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.697377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.714436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.791570 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:21:37 crc kubenswrapper[4722]: I0219 19:21:37.175533 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 19:21:37 crc kubenswrapper[4722]: I0219 19:21:37.373094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerStarted","Data":"be8a061b191347417c7eff0e39c1d45a40ce52746371e25938f78f0f9a4f9e58"} Feb 19 19:21:38 crc kubenswrapper[4722]: I0219 19:21:38.379205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerStarted","Data":"1f68a7c9928e93107f9848c5151b976a7aa149617e7e965be09dba7a86508ed6"} Feb 19 19:21:38 crc kubenswrapper[4722]: I0219 19:21:38.394256 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.3942385809999998 podStartE2EDuration="2.394238581s" podCreationTimestamp="2026-02-19 19:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:38.391609158 +0000 UTC m=+198.003959532" watchObservedRunningTime="2026-02-19 19:21:38.394238581 +0000 UTC m=+198.006588925" Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.386414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerStarted","Data":"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081"} Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.389132 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3632-a132-4377-95ef-564cffb1f299" 
containerID="ecdd2f0fffaf519cc5830b6edc00c3c6f8ed2646ef4460850d3ebbfc25bad88c" exitCode=0 Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.389169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"ecdd2f0fffaf519cc5830b6edc00c3c6f8ed2646ef4460850d3ebbfc25bad88c"} Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.391198 4722 generic.go:334] "Generic (PLEG): container finished" podID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerID="fed968269de56954a9bf853304185d7d7e89b05c7032995e1f8430c840f32748" exitCode=0 Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.391589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"fed968269de56954a9bf853304185d7d7e89b05c7032995e1f8430c840f32748"} Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.399367 4722 generic.go:334] "Generic (PLEG): container finished" podID="c594681e-de0b-4b39-98d3-573c9170c898" containerID="66cd55d7e5fc27ab50c52a8a0d368159c8c115d8bef1d54037565d69fb207dbc" exitCode=0 Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.399739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"66cd55d7e5fc27ab50c52a8a0d368159c8c115d8bef1d54037565d69fb207dbc"} Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.402757 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerStarted","Data":"5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c"} Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.406453 4722 generic.go:334] "Generic (PLEG): container 
finished" podID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" exitCode=0 Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.406532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081"} Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.408786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerStarted","Data":"ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038"} Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.411098 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerID="2aa095dd8f535949977c905c9b49fee93638ecf8347aa83cac60afa0f336cc86" exitCode=0 Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.411212 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"2aa095dd8f535949977c905c9b49fee93638ecf8347aa83cac60afa0f336cc86"} Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.414058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerStarted","Data":"d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5"} Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.454428 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-64frs" podStartSLOduration=2.20180181 podStartE2EDuration="51.454410737s" podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" 
firstStartedPulling="2026-02-19 19:20:50.644310706 +0000 UTC m=+150.256661030" lastFinishedPulling="2026-02-19 19:21:39.896919643 +0000 UTC m=+199.509269957" observedRunningTime="2026-02-19 19:21:40.453929061 +0000 UTC m=+200.066279385" watchObservedRunningTime="2026-02-19 19:21:40.454410737 +0000 UTC m=+200.066761061" Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.506666 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vqqrf" podStartSLOduration=2.320538432 podStartE2EDuration="49.506645892s" podCreationTimestamp="2026-02-19 19:20:51 +0000 UTC" firstStartedPulling="2026-02-19 19:20:52.700302359 +0000 UTC m=+152.312652683" lastFinishedPulling="2026-02-19 19:21:39.886409819 +0000 UTC m=+199.498760143" observedRunningTime="2026-02-19 19:21:40.489304162 +0000 UTC m=+200.101654496" watchObservedRunningTime="2026-02-19 19:21:40.506645892 +0000 UTC m=+200.118996216" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.420679 4722 generic.go:334] "Generic (PLEG): container finished" podID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerID="d8aaa67a4ff9066de0c0fee741280169063042f7cb7d5dafb2624fc9902e5310" exitCode=0 Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.420762 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"d8aaa67a4ff9066de0c0fee741280169063042f7cb7d5dafb2624fc9902e5310"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.422920 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerStarted","Data":"e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.425235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerStarted","Data":"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.426772 4722 generic.go:334] "Generic (PLEG): container finished" podID="12054322-fe1e-4205-b6d3-05b30024a987" containerID="ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038" exitCode=0 Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.426803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.430037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerStarted","Data":"59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.467464 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j86kw" podStartSLOduration=2.305488215 podStartE2EDuration="52.467448264s" podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" firstStartedPulling="2026-02-19 19:20:50.648020574 +0000 UTC m=+150.260370898" lastFinishedPulling="2026-02-19 19:21:40.809980623 +0000 UTC m=+200.422330947" observedRunningTime="2026-02-19 19:21:41.465194432 +0000 UTC m=+201.077544776" watchObservedRunningTime="2026-02-19 19:21:41.467448264 +0000 UTC m=+201.079798588" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.491606 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4576" podStartSLOduration=3.274298016 podStartE2EDuration="52.491590959s" 
podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" firstStartedPulling="2026-02-19 19:20:51.669812753 +0000 UTC m=+151.282163077" lastFinishedPulling="2026-02-19 19:21:40.887105706 +0000 UTC m=+200.499456020" observedRunningTime="2026-02-19 19:21:41.49070664 +0000 UTC m=+201.103056964" watchObservedRunningTime="2026-02-19 19:21:41.491590959 +0000 UTC m=+201.103941283" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.514461 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tp9x" podStartSLOduration=2.345124762 podStartE2EDuration="52.514444563s" podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" firstStartedPulling="2026-02-19 19:20:50.6519845 +0000 UTC m=+150.264334824" lastFinishedPulling="2026-02-19 19:21:40.821304301 +0000 UTC m=+200.433654625" observedRunningTime="2026-02-19 19:21:41.509995171 +0000 UTC m=+201.122345495" watchObservedRunningTime="2026-02-19 19:21:41.514444563 +0000 UTC m=+201.126794887" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.798380 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.798432 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.798476 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:21:41 crc kubenswrapper[4722]: 
I0219 19:21:41.798967 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.799037 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d" gracePeriod=600 Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.836317 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.836361 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.436930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerStarted","Data":"8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.438645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerStarted","Data":"2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.440426 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" 
containerID="dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d" exitCode=0 Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.440460 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.440484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.457227 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hg6kw" podStartSLOduration=2.355493687 podStartE2EDuration="51.457212384s" podCreationTimestamp="2026-02-19 19:20:51 +0000 UTC" firstStartedPulling="2026-02-19 19:20:52.691902421 +0000 UTC m=+152.304252745" lastFinishedPulling="2026-02-19 19:21:41.793621118 +0000 UTC m=+201.405971442" observedRunningTime="2026-02-19 19:21:42.453590849 +0000 UTC m=+202.065941173" watchObservedRunningTime="2026-02-19 19:21:42.457212384 +0000 UTC m=+202.069562708" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.502382 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4tk99" podStartSLOduration=3.384705328 podStartE2EDuration="50.502362344s" podCreationTimestamp="2026-02-19 19:20:52 +0000 UTC" firstStartedPulling="2026-02-19 19:20:54.732913417 +0000 UTC m=+154.345263751" lastFinishedPulling="2026-02-19 19:21:41.850570443 +0000 UTC m=+201.462920767" observedRunningTime="2026-02-19 19:21:42.501669902 +0000 UTC m=+202.114020226" watchObservedRunningTime="2026-02-19 19:21:42.502362344 
+0000 UTC m=+202.114712668" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.901271 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.943355 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server" probeResult="failure" output=< Feb 19 19:21:42 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:21:42 crc kubenswrapper[4722]: > Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.951120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:21:43 crc kubenswrapper[4722]: I0219 19:21:43.243445 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:43 crc kubenswrapper[4722]: I0219 19:21:43.243521 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:44 crc kubenswrapper[4722]: I0219 19:21:44.281474 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" probeResult="failure" output=< Feb 19 19:21:44 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:21:44 crc kubenswrapper[4722]: > Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.410708 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"] Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.411553 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" containerID="cri-o://7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc" gracePeriod=30 Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.428333 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.428970 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" containerID="cri-o://d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27" gracePeriod=30 Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.666889 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.667246 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.707661 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.859684 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.859723 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.922273 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.061749 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.061803 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.136328 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.293764 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.293835 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.364053 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.487366 4722 generic.go:334] "Generic (PLEG): container finished" podID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerID="d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27" exitCode=0 Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.487983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerDied","Data":"d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27"} Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.489253 4722 generic.go:334] "Generic (PLEG): container finished" podID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" 
containerID="7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc" exitCode=0 Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.489492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerDied","Data":"7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc"} Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.532241 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.541382 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.542728 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.550342 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.005647 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006798 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.007343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.007403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca" (OuterVolumeSpecName: "client-ca") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.007602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config" (OuterVolumeSpecName: "config") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.011981 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf" (OuterVolumeSpecName: "kube-api-access-xp4vf") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "kube-api-access-xp4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.012244 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.020406 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.053757 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:21:51 crc kubenswrapper[4722]: E0219 19:21:51.054258 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.054286 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: E0219 19:21:51.054318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.054326 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.056521 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.056583 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: 
I0219 19:21:51.068815 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.069523 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108196 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108285 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108610 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 
19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108899 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108932 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108951 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108968 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108984 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.109365 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.110841 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config" (OuterVolumeSpecName: "config") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.123745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.123798 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn" (OuterVolumeSpecName: "kube-api-access-m8zpn") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "kube-api-access-m8zpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.209901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.209994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210134 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210212 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod 
\"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210274 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210293 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210310 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210325 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.211125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.211447 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " 
pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.211754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.213851 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.228965 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.381801 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.504801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerDied","Data":"dbd20b6d66e3f4f6e61fa79b316492ac3959c655c54ad91cf639b4c0480d6e0e"} Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.504830 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.504859 4722 scope.go:117] "RemoveContainer" containerID="d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.521979 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.522887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerDied","Data":"eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d"} Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.542618 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.562135 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"] Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.566293 4722 scope.go:117] "RemoveContainer" containerID="7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.567501 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"] Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.572411 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.576587 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:51 crc 
kubenswrapper[4722]: I0219 19:21:51.648494 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:21:51 crc kubenswrapper[4722]: W0219 19:21:51.654602 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b05f432_113b_41f2_8c75_ec167057d648.slice/crio-9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc WatchSource:0}: Error finding container 9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc: Status 404 returned error can't find the container with id 9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.888410 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.900218 4722 patch_prober.go:28] interesting pod/controller-manager-7dc7c9d8c5-cvcxr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.900266 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.946524 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 
19:21:52.231010 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.231366 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.530634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerStarted","Data":"125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5"} Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.530704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerStarted","Data":"9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc"} Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.532906 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4576" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" containerID="cri-o://59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef" gracePeriod=2 Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.536606 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.536823 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j86kw" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" containerID="cri-o://e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c" gracePeriod=2 Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.649364 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:21:52 crc kubenswrapper[4722]: E0219 19:21:52.901476 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc594681e_de0b_4b39_98d3_573c9170c898.slice/crio-e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3947bdb_5e5a_43c8_b23d_d5aa97ebaebe.slice/crio-59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.079010 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" path="/var/lib/kubelet/pods/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c/volumes" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.080325 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" path="/var/lib/kubelet/pods/c1e07a33-6b17-400a-9697-f6746b257c3b/volumes" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.334695 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.372399 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.543935 4722 generic.go:334] "Generic (PLEG): container finished" podID="c594681e-de0b-4b39-98d3-573c9170c898" containerID="e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c" exitCode=0 Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.543982 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c"} Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.546697 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerID="59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef" exitCode=0 Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.546722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef"} Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.565978 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.566284 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" podStartSLOduration=4.566265236 podStartE2EDuration="4.566265236s" podCreationTimestamp="2026-02-19 19:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:53.564259193 +0000 UTC m=+213.176609517" watchObservedRunningTime="2026-02-19 19:21:53.566265236 +0000 UTC m=+213.178615570" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.567042 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.571332 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.571625 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.571646 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.573733 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.573759 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.574024 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.581316 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.613320 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.638276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") 
" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.739853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.740112 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.740274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.740434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.741509 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.843884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.844274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.844919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.844992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " 
pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.852058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.865501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.891899 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.150821 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.325133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:21:54 crc kubenswrapper[4722]: W0219 19:21:54.336441 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22babc4_c86d_4152_8113_84595c89b271.slice/crio-94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d WatchSource:0}: Error finding container 94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d: Status 404 returned error can't find the container with id 94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.350250 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"c594681e-de0b-4b39-98d3-573c9170c898\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.350310 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"c594681e-de0b-4b39-98d3-573c9170c898\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.350383 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"c594681e-de0b-4b39-98d3-573c9170c898\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.353355 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities" (OuterVolumeSpecName: "utilities") pod "c594681e-de0b-4b39-98d3-573c9170c898" (UID: "c594681e-de0b-4b39-98d3-573c9170c898"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.356501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8" (OuterVolumeSpecName: "kube-api-access-bhxq8") pod "c594681e-de0b-4b39-98d3-573c9170c898" (UID: "c594681e-de0b-4b39-98d3-573c9170c898"). InnerVolumeSpecName "kube-api-access-bhxq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.412439 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.416605 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c594681e-de0b-4b39-98d3-573c9170c898" (UID: "c594681e-de0b-4b39-98d3-573c9170c898"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.451754 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.451789 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.451800 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.561401 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.561539 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.561566 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.562707 
4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities" (OuterVolumeSpecName: "utilities") pod "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" (UID: "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.564714 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.564760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"ff5ad27012e651ea99b2c5454cf7b789a1c44ed2c936a800e67aa01d7e7683b4"} Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.567221 4722 scope.go:117] "RemoveContainer" containerID="59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.567833 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt" (OuterVolumeSpecName: "kube-api-access-6vckt") pod "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" (UID: "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe"). InnerVolumeSpecName "kube-api-access-6vckt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.571872 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.571937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"9b378dd4da61b5af99f5f93bba7c15d0d04355aa249d4e89b10b4d368ec3db4e"} Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.573866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerStarted","Data":"94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d"} Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.594412 4722 scope.go:117] "RemoveContainer" containerID="2aa095dd8f535949977c905c9b49fee93638ecf8347aa83cac60afa0f336cc86" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.608276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.610822 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.628232 4722 scope.go:117] "RemoveContainer" containerID="d8152997987bda50dd12277fbfbc9da38a131bf85945cd167cb7db72d9b9372b" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.642273 4722 scope.go:117] "RemoveContainer" containerID="e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.652074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" (UID: "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.658989 4722 scope.go:117] "RemoveContainer" containerID="66cd55d7e5fc27ab50c52a8a0d368159c8c115d8bef1d54037565d69fb207dbc" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.663109 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.663128 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.663137 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.678138 4722 scope.go:117] "RemoveContainer" containerID="83d63174a5dee0510e001a33beae280a6c56b7d09645762d8197fc6948f07c46" Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.895033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.897321 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.936802 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.079247 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c594681e-de0b-4b39-98d3-573c9170c898" 
path="/var/lib/kubelet/pods/c594681e-de0b-4b39-98d3-573c9170c898/volumes" Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.079914 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" path="/var/lib/kubelet/pods/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe/volumes" Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.581335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerStarted","Data":"3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255"} Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.581481 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hg6kw" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" containerID="cri-o://8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502" gracePeriod=2 Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.599628 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" podStartSLOduration=6.599609411 podStartE2EDuration="6.599609411s" podCreationTimestamp="2026-02-19 19:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:55.596127161 +0000 UTC m=+215.208477485" watchObservedRunningTime="2026-02-19 19:21:55.599609411 +0000 UTC m=+215.211959725" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.590111 4722 generic.go:334] "Generic (PLEG): container finished" podID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerID="8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502" exitCode=0 Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.590141 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502"} Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.590720 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.595626 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.647653 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.687644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.687708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.687781 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.689941 
4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities" (OuterVolumeSpecName: "utilities") pod "7ad9ab6b-efbe-4d01-97b0-281ee8a199df" (UID: "7ad9ab6b-efbe-4d01-97b0-281ee8a199df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.696947 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv" (OuterVolumeSpecName: "kube-api-access-665pv") pod "7ad9ab6b-efbe-4d01-97b0-281ee8a199df" (UID: "7ad9ab6b-efbe-4d01-97b0-281ee8a199df"). InnerVolumeSpecName "kube-api-access-665pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.718743 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ad9ab6b-efbe-4d01-97b0-281ee8a199df" (UID: "7ad9ab6b-efbe-4d01-97b0-281ee8a199df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.789601 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.789644 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.789657 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.341006 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.341385 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" containerID="cri-o://2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e" gracePeriod=2 Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.598806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d"} Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.598882 4722 scope.go:117] "RemoveContainer" containerID="8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.599283 4722 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.604453 4722 generic.go:334] "Generic (PLEG): container finished" podID="12054322-fe1e-4205-b6d3-05b30024a987" containerID="2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e" exitCode=0 Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.604558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e"} Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.629369 4722 scope.go:117] "RemoveContainer" containerID="d8aaa67a4ff9066de0c0fee741280169063042f7cb7d5dafb2624fc9902e5310" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.640533 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.646640 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.674257 4722 scope.go:117] "RemoveContainer" containerID="0dd65a739e9f5e8ad490009cf2eebc6f6859f0fe25f4e418d1b7a49467014a17" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.817466 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.903392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"12054322-fe1e-4205-b6d3-05b30024a987\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.903489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"12054322-fe1e-4205-b6d3-05b30024a987\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.903523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"12054322-fe1e-4205-b6d3-05b30024a987\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.904656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities" (OuterVolumeSpecName: "utilities") pod "12054322-fe1e-4205-b6d3-05b30024a987" (UID: "12054322-fe1e-4205-b6d3-05b30024a987"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.910323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n" (OuterVolumeSpecName: "kube-api-access-b987n") pod "12054322-fe1e-4205-b6d3-05b30024a987" (UID: "12054322-fe1e-4205-b6d3-05b30024a987"). InnerVolumeSpecName "kube-api-access-b987n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.004685 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.004724 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.041855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12054322-fe1e-4205-b6d3-05b30024a987" (UID: "12054322-fe1e-4205-b6d3-05b30024a987"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.106148 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.613602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b"} Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.613660 4722 scope.go:117] "RemoveContainer" containerID="2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.613770 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.633684 4722 scope.go:117] "RemoveContainer" containerID="ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.649956 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.653767 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.666339 4722 scope.go:117] "RemoveContainer" containerID="57d551ccacbc04d55c2cac5a3bb7ceb078d63f2d275222bd8c776cbc6fad014d" Feb 19 19:21:59 crc kubenswrapper[4722]: I0219 19:21:59.081933 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12054322-fe1e-4205-b6d3-05b30024a987" path="/var/lib/kubelet/pods/12054322-fe1e-4205-b6d3-05b30024a987/volumes" Feb 19 19:21:59 crc kubenswrapper[4722]: I0219 19:21:59.082647 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" path="/var/lib/kubelet/pods/7ad9ab6b-efbe-4d01-97b0-281ee8a199df/volumes" Feb 19 19:22:01 crc kubenswrapper[4722]: I0219 19:22:01.382396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:01 crc kubenswrapper[4722]: I0219 19:22:01.389411 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:02 crc kubenswrapper[4722]: I0219 19:22:02.528536 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"] Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.412015 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.412571 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" containerID="cri-o://125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5" gracePeriod=30 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.437998 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.438274 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" containerID="cri-o://3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255" gracePeriod=30 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.691957 4722 generic.go:334] "Generic (PLEG): container finished" podID="0b05f432-113b-41f2-8c75-ec167057d648" containerID="125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5" exitCode=0 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.692007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerDied","Data":"125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5"} Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.693611 4722 generic.go:334] "Generic (PLEG): container finished" podID="a22babc4-c86d-4152-8113-84595c89b271" containerID="3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255" exitCode=0 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.693646 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerDied","Data":"3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255"} Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.887009 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055382 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055436 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.056315 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca" (OuterVolumeSpecName: "client-ca") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.056372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config" (OuterVolumeSpecName: "config") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.066184 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.067439 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f" (OuterVolumeSpecName: "kube-api-access-pm66f") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "kube-api-access-pm66f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157010 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157040 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157051 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157079 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.468446 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.561784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562125 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562195 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562230 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.563082 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.563117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.563189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config" (OuterVolumeSpecName: "config") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.565632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.565890 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb" (OuterVolumeSpecName: "kube-api-access-x7wvb") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "kube-api-access-x7wvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.586774 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k"] Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587130 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587183 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587206 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587219 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587232 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587243 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587256 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587268 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587291 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587304 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587321 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587333 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587350 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587360 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587373 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587383 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587397 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587408 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587462 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587474 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587490 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587502 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587517 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587528 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587539 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587550 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587564 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587575 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587750 4722 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587775 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587795 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587809 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587820 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587840 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.588490 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.594592 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.595733 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.599138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.602406 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664212 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664249 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664260 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664271 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664283 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.701063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerDied","Data":"94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d"} Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.701107 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.701137 4722 scope.go:117] "RemoveContainer" containerID="3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.702536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerDied","Data":"9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc"} Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.702646 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.721190 4722 scope.go:117] "RemoveContainer" containerID="125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.736171 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.741719 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.745529 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.748775 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-config\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-config\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765226 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f2cw\" (UniqueName: \"kubernetes.io/projected/dad06c7c-a6ab-40f3-860c-87def86419fd-kube-api-access-9f2cw\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-client-ca\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95c29\" (UniqueName: \"kubernetes.io/projected/053ce374-dacc-4077-a873-22ff300b8c46-kube-api-access-95c29\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-proxy-ca-bundles\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/053ce374-dacc-4077-a873-22ff300b8c46-serving-cert\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-client-ca\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad06c7c-a6ab-40f3-860c-87def86419fd-serving-cert\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.866939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/053ce374-dacc-4077-a873-22ff300b8c46-serving-cert\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-client-ca\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc 
kubenswrapper[4722]: I0219 19:22:10.867056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad06c7c-a6ab-40f3-860c-87def86419fd-serving-cert\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-config\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-config\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f2cw\" (UniqueName: \"kubernetes.io/projected/dad06c7c-a6ab-40f3-860c-87def86419fd-kube-api-access-9f2cw\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-client-ca\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: 
\"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95c29\" (UniqueName: \"kubernetes.io/projected/053ce374-dacc-4077-a873-22ff300b8c46-kube-api-access-95c29\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-proxy-ca-bundles\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.868806 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-client-ca\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.869322 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-proxy-ca-bundles\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.869471 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-client-ca\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.870134 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-config\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.870703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-config\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.874135 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad06c7c-a6ab-40f3-860c-87def86419fd-serving-cert\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.875700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/053ce374-dacc-4077-a873-22ff300b8c46-serving-cert\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.888637 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-95c29\" (UniqueName: \"kubernetes.io/projected/053ce374-dacc-4077-a873-22ff300b8c46-kube-api-access-95c29\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.896729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f2cw\" (UniqueName: \"kubernetes.io/projected/dad06c7c-a6ab-40f3-860c-87def86419fd-kube-api-access-9f2cw\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.938423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.944678 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.080080 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b05f432-113b-41f2-8c75-ec167057d648" path="/var/lib/kubelet/pods/0b05f432-113b-41f2-8c75-ec167057d648/volumes" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.080768 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22babc4-c86d-4152-8113-84595c89b271" path="/var/lib/kubelet/pods/a22babc4-c86d-4152-8113-84595c89b271/volumes" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.396676 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k"] Feb 19 19:22:11 crc kubenswrapper[4722]: W0219 19:22:11.407945 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053ce374_dacc_4077_a873_22ff300b8c46.slice/crio-33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06 WatchSource:0}: Error finding container 33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06: Status 404 returned error can't find the container with id 33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06 Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.444655 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c"] Feb 19 19:22:11 crc kubenswrapper[4722]: W0219 19:22:11.450407 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad06c7c_a6ab_40f3_860c_87def86419fd.slice/crio-24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806 WatchSource:0}: Error finding container 24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806: Status 404 returned error can't find the container with 
id 24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806 Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.708848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" event={"ID":"053ce374-dacc-4077-a873-22ff300b8c46","Type":"ContainerStarted","Data":"48d6446db2fb5432fc773ac6bf2aa0912885ad6a44d6c26f955f6d0843d51080"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.708900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" event={"ID":"053ce374-dacc-4077-a873-22ff300b8c46","Type":"ContainerStarted","Data":"33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.709060 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.710125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" event={"ID":"dad06c7c-a6ab-40f3-860c-87def86419fd","Type":"ContainerStarted","Data":"b9621a4da94da727c914810f827019ed3e57eb5725fea68ed16eb893370a0736"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.710194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" event={"ID":"dad06c7c-a6ab-40f3-860c-87def86419fd","Type":"ContainerStarted","Data":"24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.710382 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.715125 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.730865 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" podStartSLOduration=2.730839022 podStartE2EDuration="2.730839022s" podCreationTimestamp="2026-02-19 19:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:11.727246898 +0000 UTC m=+231.339597222" watchObservedRunningTime="2026-02-19 19:22:11.730839022 +0000 UTC m=+231.343189366" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.756033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" podStartSLOduration=2.756003859 podStartE2EDuration="2.756003859s" podCreationTimestamp="2026-02-19 19:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:11.750537676 +0000 UTC m=+231.362888000" watchObservedRunningTime="2026-02-19 19:22:11.756003859 +0000 UTC m=+231.368354193" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.894942 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.398257 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399195 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399295 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399348 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399391 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399571 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399597 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.399912 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399932 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.399950 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399965 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.399980 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399992 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400019 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400032 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400049 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400061 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400082 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400094 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400109 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400122 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400299 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400321 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400334 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400351 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400369 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400394 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.402801 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.403261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.408363 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.440624 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531877 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.532033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.532195 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.532311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633604 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633676 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.634019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.634075 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 
19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.634139 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.734654 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.747868 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.749682 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750512 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750557 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750568 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750575 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" exitCode=2 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750635 4722 scope.go:117] "RemoveContainer" containerID="e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.753820 4722 generic.go:334] "Generic (PLEG): container finished" podID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerID="1f68a7c9928e93107f9848c5151b976a7aa149617e7e965be09dba7a86508ed6" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.753889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerDied","Data":"1f68a7c9928e93107f9848c5151b976a7aa149617e7e965be09dba7a86508ed6"} Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.755107 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.756349 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:15 crc kubenswrapper[4722]: W0219 19:22:15.770081 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a WatchSource:0}: Error finding container 
6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a: Status 404 returned error can't find the container with id 6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.773335 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895bc2e5e0bf27d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,LastTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.763409 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.766930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103"} Feb 19 19:22:16 crc 
kubenswrapper[4722]: I0219 19:22:16.767012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a"} Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.768320 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.768753 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.145183 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.146235 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.146625 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.254931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255074 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255182 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d98fac92-53aa-469c-b47e-4cc6edd91ef7" (UID: "d98fac92-53aa-469c-b47e-4cc6edd91ef7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock" (OuterVolumeSpecName: "var-lock") pod "d98fac92-53aa-469c-b47e-4cc6edd91ef7" (UID: "d98fac92-53aa-469c-b47e-4cc6edd91ef7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.256002 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.256047 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.261333 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d98fac92-53aa-469c-b47e-4cc6edd91ef7" (UID: "d98fac92-53aa-469c-b47e-4cc6edd91ef7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.357268 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.771332 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.772556 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.772928 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.772993 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773218 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773393 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773756 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" exitCode=0 Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773810 4722 scope.go:117] "RemoveContainer" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.774890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerDied","Data":"be8a061b191347417c7eff0e39c1d45a40ce52746371e25938f78f0f9a4f9e58"} Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.774912 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8a061b191347417c7eff0e39c1d45a40ce52746371e25938f78f0f9a4f9e58" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.774930 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.787940 4722 scope.go:117] "RemoveContainer" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.789682 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.789914 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.790162 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.804699 4722 scope.go:117] "RemoveContainer" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.817226 4722 scope.go:117] "RemoveContainer" containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.831937 4722 scope.go:117] "RemoveContainer" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" Feb 19 19:22:17 crc 
kubenswrapper[4722]: I0219 19:22:17.850264 4722 scope.go:117] "RemoveContainer" containerID="6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863563 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863680 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863697 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864185 4722 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864207 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864220 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864392 4722 scope.go:117] "RemoveContainer" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.864978 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\": container with ID starting with 0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae not found: ID does not exist" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865004 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae"} err="failed to get container status \"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\": rpc error: code = NotFound desc = could not find container \"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\": container with ID starting with 0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865023 4722 scope.go:117] "RemoveContainer" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.865429 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\": container with ID starting with 299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae not found: ID does not exist" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865454 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae"} err="failed to get container status \"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\": rpc error: code = NotFound desc = could not find container \"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\": container with ID starting with 299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865467 4722 scope.go:117] "RemoveContainer" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.865746 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\": container with ID starting with b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003 not found: ID does not exist" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865764 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003"} err="failed to get container status \"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\": rpc error: code = NotFound desc = could not find container \"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\": container with ID starting with b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003 not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865778 4722 scope.go:117] "RemoveContainer" containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.866021 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\": container with ID starting with c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb not found: ID does not exist" containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866042 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb"} err="failed to get container status \"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\": rpc error: code = NotFound desc = could not find container 
\"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\": container with ID starting with c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866056 4722 scope.go:117] "RemoveContainer" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.866288 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\": container with ID starting with e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a not found: ID does not exist" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866308 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a"} err="failed to get container status \"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\": rpc error: code = NotFound desc = could not find container \"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\": container with ID starting with e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866320 4722 scope.go:117] "RemoveContainer" containerID="6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.866501 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\": container with ID starting with 6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d not found: ID does not exist" 
containerID="6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866522 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d"} err="failed to get container status \"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\": rpc error: code = NotFound desc = could not find container \"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\": container with ID starting with 6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d not found: ID does not exist" Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.781278 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.796572 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.796875 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.797064 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": 
dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:19 crc kubenswrapper[4722]: I0219 19:22:19.077737 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 19 19:22:19 crc kubenswrapper[4722]: E0219 19:22:19.634736 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895bc2e5e0bf27d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,LastTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 19:22:21 crc kubenswrapper[4722]: I0219 19:22:21.079225 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:21 crc kubenswrapper[4722]: I0219 19:22:21.079438 4722 
status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.935559 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.936474 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.937028 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.937725 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.938130 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:21 crc kubenswrapper[4722]: I0219 19:22:21.938172 4722 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 19 
19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.938459 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms" Feb 19 19:22:22 crc kubenswrapper[4722]: E0219 19:22:22.140298 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms" Feb 19 19:22:22 crc kubenswrapper[4722]: E0219 19:22:22.541537 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms" Feb 19 19:22:23 crc kubenswrapper[4722]: E0219 19:22:23.342745 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s" Feb 19 19:22:24 crc kubenswrapper[4722]: E0219 19:22:24.944341 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.071273 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.072303 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.072654 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.093043 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.093077 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:26 crc kubenswrapper[4722]: E0219 19:22:26.093441 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.093888 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:26 crc kubenswrapper[4722]: W0219 19:22:26.135690 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671 WatchSource:0}: Error finding container 8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671: Status 404 returned error can't find the container with id 8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671 Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.835207 4722 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="0592a4af60ae44c6f38001780aebfe314c6fffcbec293bd9f92aa6cfc99936f9" exitCode=0 Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.835330 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"0592a4af60ae44c6f38001780aebfe314c6fffcbec293bd9f92aa6cfc99936f9"} Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.835939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671"} Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.836362 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.836466 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:26 crc kubenswrapper[4722]: E0219 19:22:26.837124 4722 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.837573 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.838105 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.554645 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" containerID="cri-o://025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e" gracePeriod=15 Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.848402 4722 generic.go:334] "Generic (PLEG): container finished" podID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerID="025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e" exitCode=0 Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.849277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" 
event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerDied","Data":"025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e"} Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.867078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"216074f71ec1f1180401fdc600d91e1d4aa94547ec510e042aba30fce443a118"} Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.867113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1f75f0329981877025ef171683d0e8b983576e28f3b7448907e7c35bd6b37efc"} Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.867121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ed6ef704a2ba61c71f9580c8186d27acaf49ec150b28d5e968224a0f9f4b14c"} Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.067737 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208454 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208530 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208589 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: 
\"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208902 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208935 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208979 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.209013 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.209084 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.209508 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.210475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.210745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.211047 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.212686 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.215108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.215565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.215999 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m" (OuterVolumeSpecName: "kube-api-access-qjd4m") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "kube-api-access-qjd4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.216395 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217035 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.228641 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310572 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310603 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310612 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310621 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310630 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310640 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310652 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310660 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310671 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310679 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310687 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310695 4722 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310716 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310724 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.882247 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerDied","Data":"e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545"} Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.882299 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.882310 4722 scope.go:117] "RemoveContainer" containerID="025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.884978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"08f3fed92446bcc87c06e72bb6a1f3f2ebcf3ec1329c968e778029b00a0dae40"} Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7ae6cb64355eb99531448a0ad974367a29a320a1b0434b7d243ade2b753bfd96"} Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885343 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885366 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885567 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.904040 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.904440 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c" exitCode=1 
Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.904497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c"} Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.905382 4722 scope.go:117] "RemoveContainer" containerID="985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c" Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.992730 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.095298 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.095379 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.100887 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.915511 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.915950 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329"} Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.964718 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.901190 4722 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.931754 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.931812 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.936937 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.939550 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="68634038-94c9-4bb4-a05e-3d47197c3f1e" Feb 19 19:22:34 crc kubenswrapper[4722]: I0219 19:22:34.937213 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:34 crc kubenswrapper[4722]: I0219 19:22:34.937267 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da" Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.580808 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.582806 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.582906 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.607986 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.969080 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.087670 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.088729 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="68634038-94c9-4bb4-a05e-3d47197c3f1e" Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.376002 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.389871 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.679765 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 19:22:41 crc 
kubenswrapper[4722]: I0219 19:22:41.944594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.080040 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.112340 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.118772 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=27.118748037 podStartE2EDuration="27.118748037s" podCreationTimestamp="2026-02-19 19:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:33.746127609 +0000 UTC m=+253.358477933" watchObservedRunningTime="2026-02-19 19:22:42.118748037 +0000 UTC m=+261.731098411" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.121049 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-ndzb8"] Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.121133 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.128556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.144064 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.144044608 podStartE2EDuration="9.144044608s" podCreationTimestamp="2026-02-19 19:22:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:42.142945674 +0000 UTC m=+261.755296038" watchObservedRunningTime="2026-02-19 19:22:42.144044608 +0000 UTC m=+261.756394932" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.309345 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.867684 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.874647 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.906878 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.987605 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.080691 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" path="/var/lib/kubelet/pods/ecc880c8-beb9-4081-8af6-64d2fa857901/volumes" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.143888 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.369015 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.963262 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" 
Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.017753 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.166849 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.182004 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.424572 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.811863 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.912002 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.078815 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.079010 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" gracePeriod=5 Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.125764 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.132734 4722 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.559042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.624753 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.841644 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.888401 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.941588 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364120 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"] Feb 19 19:22:46 crc kubenswrapper[4722]: E0219 19:22:46.364361 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364375 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 19:22:46 crc kubenswrapper[4722]: E0219 19:22:46.364391 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364398 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" Feb 19 
19:22:46 crc kubenswrapper[4722]: E0219 19:22:46.364407 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerName="installer" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364415 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerName="installer" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364528 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364546 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerName="installer" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364555 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364954 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.368368 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.369771 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.369968 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.370367 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.370851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.372097 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.372526 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.372949 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.373221 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.374211 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 19:22:46 crc 
kubenswrapper[4722]: I0219 19:22:46.378266 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.380970 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.381652 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.387733 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.400102 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538335 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538411 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538480 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzl2b\" (UniqueName: \"kubernetes.io/projected/473612c5-4d08-4767-adb9-4bfe5d8a05f1-kube-api-access-qzl2b\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538515 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538602 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-dir\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538666 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-policies\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: 
\"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538937 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-dir\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640737 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640772 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-policies\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640805 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-dir\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " 
pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641008 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641048 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641083 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641143 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzl2b\" (UniqueName: \"kubernetes.io/projected/473612c5-4d08-4767-adb9-4bfe5d8a05f1-kube-api-access-qzl2b\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641201 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: 
\"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.642757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-policies\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.644304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.645906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.649236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.649537 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.650043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.651070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.651802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " 
pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.657424 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.657802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.658874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.661174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzl2b\" (UniqueName: \"kubernetes.io/projected/473612c5-4d08-4767-adb9-4bfe5d8a05f1-kube-api-access-qzl2b\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.694971 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.709815 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.778466 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.941306 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.028003 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.042228 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.107918 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.275130 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.395612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.421053 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.454500 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 19:22:47 crc 
kubenswrapper[4722]: I0219 19:22:47.623934 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.819999 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.924014 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.085982 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.346364 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.489794 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.629293 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.684289 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.871250 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.007432 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.354560 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.381144 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.476599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.681001 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.768830 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.845343 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.855920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.876696 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.058322 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.110609 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.179528 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 19:22:50 crc 
kubenswrapper[4722]: I0219 19:22:50.295314 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.337394 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.355257 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.390902 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.421655 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.529516 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.581253 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.581317 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.652854 4722 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.675303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.675386 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.689470 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.762591 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.784337 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793323 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793410 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793438 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793787 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794013 4722 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794028 4722 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794039 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794052 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.800745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod 
"f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.803698 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.826869 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.883278 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.887893 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.896041 4722 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.896206 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.931736 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.980284 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.029738 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 19:22:51 crc 
kubenswrapper[4722]: I0219 19:22:51.029793 4722 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" exitCode=137 Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.029836 4722 scope.go:117] "RemoveContainer" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.029869 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.047976 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.049053 4722 scope.go:117] "RemoveContainer" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" Feb 19 19:22:51 crc kubenswrapper[4722]: E0219 19:22:51.049445 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103\": container with ID starting with 91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103 not found: ID does not exist" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.049477 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103"} err="failed to get container status \"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103\": rpc error: code = NotFound desc = could not find container \"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103\": container with ID starting with 
91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103 not found: ID does not exist" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.081458 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.081823 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.094035 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.094069 4722 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c4acea01-a073-46c4-bacf-1743a4f16e02" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.099079 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.099120 4722 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c4acea01-a073-46c4-bacf-1743a4f16e02" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.277790 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.299361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.398023 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 19:22:51 
crc kubenswrapper[4722]: I0219 19:22:51.400745 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.424334 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.446044 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.531040 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.552862 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.560333 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.584227 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.635265 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.725851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.788371 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.827232 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.855697 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.890763 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.038759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.066208 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.167786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.228307 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.268716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.319615 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.338979 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.363307 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 
19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.451933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.492064 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.560144 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.572046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.575023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.603602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.701017 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.732947 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.758130 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.875738 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.966577 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 
19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.029845 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.116391 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.194423 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.202451 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.209471 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.221431 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.332419 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.444450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.562642 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.573524 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.579527 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.599488 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.638712 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.771204 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.952971 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.288380 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.376533 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.465383 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.503323 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.527113 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.611485 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 
19:22:54.622730 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.635800 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.652320 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.666870 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.742432 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.770210 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"] Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.772367 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.774026 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.854292 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.187853 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.210087 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 
19:22:55.221935 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.323086 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.332318 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.515311 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.527477 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.563725 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576639 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 19 19:22:55 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f" Netns:"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:55 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:55 crc kubenswrapper[4722]: > Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576710 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 19 19:22:55 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f" Netns:"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:55 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:55 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576731 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 19 19:22:55 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f" 
Netns:"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:55 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:55 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576778 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod 
openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f\\\" Netns:\\\"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod \\\"oauth-openshift-58b6dc46cc-sf28m\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" podUID="473612c5-4d08-4767-adb9-4bfe5d8a05f1" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.629901 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.641324 4722 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.656486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.832272 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.862853 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.899626 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.016676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.062186 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.062970 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.109222 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.133849 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.214744 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.238682 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.249969 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.257166 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.349378 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.382933 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.679374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.772863 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 
19:22:56.871867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.935974 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.030536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.073354 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.081606 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.185866 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.216034 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.229604 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.244484 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.298215 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.531409 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.552576 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.576209 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.588632 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.697669 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.864062 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.086640 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.114962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.127495 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.161495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.184107 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" 
Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.185450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.319023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.348418 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.493566 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.605292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.610287 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.641264 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.668278 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.771525 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.858472 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.901766 4722 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.907432 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.945760 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.083478 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.158784 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.163924 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.219578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.238692 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258569 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 19 19:22:59 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621" Netns:"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:59 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:59 crc kubenswrapper[4722]: > Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258799 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 19 19:22:59 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621" Netns:"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:59 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:59 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258825 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 19 19:22:59 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request 
failed with status 400: 'ContainerID:"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621" Netns:"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:59 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:59 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621\\\" Netns:\\\"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod \\\"oauth-openshift-58b6dc46cc-sf28m\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" podUID="473612c5-4d08-4767-adb9-4bfe5d8a05f1" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.373940 4722 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.527716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.665607 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.670602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.773448 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.822035 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.972498 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.060135 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.295844 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.457920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.581786 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.581892 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.581968 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.583589 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.584452 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329" gracePeriod=30 Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.649483 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.691758 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 
19:23:00.767574 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.800500 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.833081 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.928144 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.075645 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.279701 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.334402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.680839 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.846858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.944633 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.093914 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.107495 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.142464 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.310105 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.779515 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.781300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.783494 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.954820 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 19:23:12 crc kubenswrapper[4722]: I0219 19:23:12.070364 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:23:12 crc kubenswrapper[4722]: I0219 19:23:12.071497 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:23:12 crc kubenswrapper[4722]: I0219 19:23:12.502271 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"] Feb 19 19:23:12 crc kubenswrapper[4722]: W0219 19:23:12.510251 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod473612c5_4d08_4767_adb9_4bfe5d8a05f1.slice/crio-0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4 WatchSource:0}: Error finding container 0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4: Status 404 returned error can't find the container with id 0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4 Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.171327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" event={"ID":"473612c5-4d08-4767-adb9-4bfe5d8a05f1","Type":"ContainerStarted","Data":"998f033948c34dc1661b15f368dc8dbd76aaf915eba4c2403bcea03a739e915f"} Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.171770 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.171791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" event={"ID":"473612c5-4d08-4767-adb9-4bfe5d8a05f1","Type":"ContainerStarted","Data":"0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4"} Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.180722 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.205220 4722 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" podStartSLOduration=71.205198606 podStartE2EDuration="1m11.205198606s" podCreationTimestamp="2026-02-19 19:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:23:13.197755195 +0000 UTC m=+292.810105529" watchObservedRunningTime="2026-02-19 19:23:13.205198606 +0000 UTC m=+292.817548960" Feb 19 19:23:18 crc kubenswrapper[4722]: I0219 19:23:18.211696 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerID="3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145" exitCode=0 Feb 19 19:23:18 crc kubenswrapper[4722]: I0219 19:23:18.212262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerDied","Data":"3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145"} Feb 19 19:23:18 crc kubenswrapper[4722]: I0219 19:23:18.212936 4722 scope.go:117] "RemoveContainer" containerID="3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145" Feb 19 19:23:19 crc kubenswrapper[4722]: I0219 19:23:19.220015 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerStarted","Data":"1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc"} Feb 19 19:23:19 crc kubenswrapper[4722]: I0219 19:23:19.220634 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:23:19 crc kubenswrapper[4722]: I0219 19:23:19.222959 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:23:20 crc 
kubenswrapper[4722]: I0219 19:23:20.879476 4722 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.286796 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289205 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289260 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329" exitCode=137 Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329"} Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289325 4722 scope.go:117] "RemoveContainer" containerID="985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c" Feb 19 19:23:32 crc kubenswrapper[4722]: I0219 19:23:32.296586 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 19 19:23:32 crc kubenswrapper[4722]: I0219 19:23:32.297927 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"14d170beba3bf45db7aa9b935595723d9de04df3941a7da55d0113f9e65c3d49"} Feb 19 19:23:40 crc kubenswrapper[4722]: I0219 19:23:40.581735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:23:40 crc kubenswrapper[4722]: I0219 19:23:40.588071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:23:41 crc kubenswrapper[4722]: I0219 19:23:41.351640 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:23:41 crc kubenswrapper[4722]: I0219 19:23:41.355396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:24:11 crc kubenswrapper[4722]: I0219 19:24:11.798874 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:24:11 crc kubenswrapper[4722]: I0219 19:24:11.799464 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.278786 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-whpmj"] Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.280494 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.297861 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-whpmj"] Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300771 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-certificates\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-bound-sa-token\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301002 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-trusted-ca\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-tls\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301076 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nqxm\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-kube-api-access-9nqxm\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.330829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-certificates\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-bound-sa-token\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-trusted-ca\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402356 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-tls\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nqxm\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-kube-api-access-9nqxm\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.403145 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.403814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-trusted-ca\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 
crc kubenswrapper[4722]: I0219 19:24:18.403940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-certificates\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.408898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-tls\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.409791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.421896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nqxm\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-kube-api-access-9nqxm\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.430875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-bound-sa-token\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.598131 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.811851 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-whpmj"] Feb 19 19:24:19 crc kubenswrapper[4722]: I0219 19:24:19.574432 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" event={"ID":"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc","Type":"ContainerStarted","Data":"37defc6843c54ec20eea2e2778d80087362ecf2eb99fd8d2a18d8480bec583c0"} Feb 19 19:24:19 crc kubenswrapper[4722]: I0219 19:24:19.574917 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:19 crc kubenswrapper[4722]: I0219 19:24:19.574941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" event={"ID":"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc","Type":"ContainerStarted","Data":"46fd38d27d001eb42e066728c63250b9b035822d4354987401399c0d13036dad"} Feb 19 19:24:38 crc kubenswrapper[4722]: I0219 19:24:38.611487 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" Feb 19 19:24:38 crc kubenswrapper[4722]: I0219 19:24:38.645351 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" podStartSLOduration=20.645326155 podStartE2EDuration="20.645326155s" podCreationTimestamp="2026-02-19 19:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:24:19.613521893 +0000 UTC 
m=+359.225872257" watchObservedRunningTime="2026-02-19 19:24:38.645326155 +0000 UTC m=+378.257676509" Feb 19 19:24:38 crc kubenswrapper[4722]: I0219 19:24:38.695590 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.206504 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.206830 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tp9x" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server" containerID="cri-o://8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" gracePeriod=30 Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.231077 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64frs"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.231694 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server" containerID="cri-o://d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" gracePeriod=30 Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.243728 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.243986 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" containerID="cri-o://1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc" gracePeriod=30 Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 
19:24:39.260844 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.261127 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server" containerID="cri-o://5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c" gracePeriod=30 Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.270121 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lrwfz"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.270960 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.285131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.285397 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rnljk" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server" containerID="cri-o://46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee" gracePeriod=30 Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.289503 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lrwfz"] Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.329567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwvqv\" (UniqueName: \"kubernetes.io/projected/6fb12d29-ac35-4e04-a25d-05b1b2545b81-kube-api-access-qwvqv\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.329651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.329709 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.431590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwvqv\" (UniqueName: \"kubernetes.io/projected/6fb12d29-ac35-4e04-a25d-05b1b2545b81-kube-api-access-qwvqv\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.431678 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.431728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.433085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.449846 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.456525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwvqv\" (UniqueName: \"kubernetes.io/projected/6fb12d29-ac35-4e04-a25d-05b1b2545b81-kube-api-access-qwvqv\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.597746 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.601451 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.633201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.633271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.633854 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.634344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities" (OuterVolumeSpecName: "utilities") pod "396bbbdf-7f78-48e7-b02c-0737c221aaa6" (UID: "396bbbdf-7f78-48e7-b02c-0737c221aaa6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.634684 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.637095 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr" (OuterVolumeSpecName: "kube-api-access-nvbkr") pod "396bbbdf-7f78-48e7-b02c-0737c221aaa6" (UID: "396bbbdf-7f78-48e7-b02c-0737c221aaa6"). InnerVolumeSpecName "kube-api-access-nvbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.668417 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not found" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.668952 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not found" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.669646 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not 
found" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.669696 4722 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.696449 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "396bbbdf-7f78-48e7-b02c-0737c221aaa6" (UID: "396bbbdf-7f78-48e7-b02c-0737c221aaa6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720266 4722 generic.go:334] "Generic (PLEG): container finished" podID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720351 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"8c40a4539d5d6930a5a906cb44965a1810a1f2192dbfb01db14eeaf97f5cc6ee"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720415 4722 
scope.go:117] "RemoveContainer" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720599 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.730576 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3632-a132-4377-95ef-564cffb1f299" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.730699 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.737823 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.737864 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.740784 4722 generic.go:334] "Generic (PLEG): container finished" podID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerID="5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.740848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" 
event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.744994 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerID="46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.745093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.752690 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerID="1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.752732 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerDied","Data":"1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.757092 4722 scope.go:117] "RemoveContainer" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.757777 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.763635 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.785037 4722 scope.go:117] "RemoveContainer" 
containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.800305 4722 scope.go:117] "RemoveContainer" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.800797 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75\": container with ID starting with 8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75 not found: ID does not exist" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.800827 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75"} err="failed to get container status \"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75\": rpc error: code = NotFound desc = could not find container \"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75\": container with ID starting with 8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75 not found: ID does not exist" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.800852 4722 scope.go:117] "RemoveContainer" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.801137 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081\": container with ID starting with df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081 not found: ID does not exist" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" Feb 19 19:24:40 crc 
kubenswrapper[4722]: I0219 19:24:39.801174 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081"} err="failed to get container status \"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081\": rpc error: code = NotFound desc = could not find container \"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081\": container with ID starting with df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081 not found: ID does not exist" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.801271 4722 scope.go:117] "RemoveContainer" containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.801702 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6\": container with ID starting with 4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6 not found: ID does not exist" containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.801728 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6"} err="failed to get container status \"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6\": rpc error: code = NotFound desc = could not find container \"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6\": container with ID starting with 4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6 not found: ID does not exist" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.801745 4722 scope.go:117] "RemoveContainer" containerID="3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145" Feb 19 
19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.313143 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.328622 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.347840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"0c9d3632-a132-4377-95ef-564cffb1f299\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.347964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348117 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"0c9d3632-a132-4377-95ef-564cffb1f299\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348222 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"0c9d3632-a132-4377-95ef-564cffb1f299\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348293 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz4g5\" (UniqueName: 
\"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348462 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities" (OuterVolumeSpecName: "utilities") pod "0c9d3632-a132-4377-95ef-564cffb1f299" (UID: "0c9d3632-a132-4377-95ef-564cffb1f299"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348936 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.349528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities" (OuterVolumeSpecName: "utilities") pod "f10dae1c-d938-4cce-893b-4ad7eca7d23f" (UID: "f10dae1c-d938-4cce-893b-4ad7eca7d23f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.358200 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5" (OuterVolumeSpecName: "kube-api-access-9szc5") pod "0c9d3632-a132-4377-95ef-564cffb1f299" (UID: "0c9d3632-a132-4377-95ef-564cffb1f299"). InnerVolumeSpecName "kube-api-access-9szc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.358632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5" (OuterVolumeSpecName: "kube-api-access-fz4g5") pod "f10dae1c-d938-4cce-893b-4ad7eca7d23f" (UID: "f10dae1c-d938-4cce-893b-4ad7eca7d23f"). InnerVolumeSpecName "kube-api-access-fz4g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.373794 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f10dae1c-d938-4cce-893b-4ad7eca7d23f" (UID: "f10dae1c-d938-4cce-893b-4ad7eca7d23f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.405091 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c9d3632-a132-4377-95ef-564cffb1f299" (UID: "0c9d3632-a132-4377-95ef-564cffb1f299"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.408132 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.409727 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449360 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449438 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"cb6886b7-9193-4c89-96c8-64b61c3251a4\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"cb6886b7-9193-4c89-96c8-64b61c3251a4\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449581 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") 
pod \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") "
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449606 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"cb6886b7-9193-4c89-96c8-64b61c3251a4\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") "
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449819 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449835 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449847 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449857 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449891 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.450114 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities" (OuterVolumeSpecName: "utilities") pod "2bb14baa-8bfc-415a-aa95-50b79f3c75ea" (UID: "2bb14baa-8bfc-415a-aa95-50b79f3c75ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.450404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cb6886b7-9193-4c89-96c8-64b61c3251a4" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.453006 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p" (OuterVolumeSpecName: "kube-api-access-sbw2p") pod "2bb14baa-8bfc-415a-aa95-50b79f3c75ea" (UID: "2bb14baa-8bfc-415a-aa95-50b79f3c75ea"). InnerVolumeSpecName "kube-api-access-sbw2p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.453299 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64" (OuterVolumeSpecName: "kube-api-access-z8z64") pod "cb6886b7-9193-4c89-96c8-64b61c3251a4" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4"). InnerVolumeSpecName "kube-api-access-z8z64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.454693 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cb6886b7-9193-4c89-96c8-64b61c3251a4" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.490582 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lrwfz"]
Feb 19 19:24:40 crc kubenswrapper[4722]: W0219 19:24:40.492309 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fb12d29_ac35_4e04_a25d_05b1b2545b81.slice/crio-5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50 WatchSource:0}: Error finding container 5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50: Status 404 returned error can't find the container with id 5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550728 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550756 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550768 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550777 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550786 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.602829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bb14baa-8bfc-415a-aa95-50b79f3c75ea" (UID: "2bb14baa-8bfc-415a-aa95-50b79f3c75ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.651816 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.759476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" event={"ID":"6fb12d29-ac35-4e04-a25d-05b1b2545b81","Type":"ContainerStarted","Data":"29bdb6089b8cc7071e4915a0ffa320ee1fb628b638ea3c2c0929496c59394785"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.759525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" event={"ID":"6fb12d29-ac35-4e04-a25d-05b1b2545b81","Type":"ContainerStarted","Data":"5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.759952 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.760574 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lrwfz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body=
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.760614 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" podUID="6fb12d29-ac35-4e04-a25d-05b1b2545b81" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.761762 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"d33d192f020b6508198a4a19887938ad42d94be353afef74a8413b4aa30e91d1"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.761805 4722 scope.go:117] "RemoveContainer" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.761878 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.763649 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.763658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"104233a8c5f814fc84e4081cc01af39a90044fcd055492fd733214b7e3b634d4"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.769639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"1c1bf847d9c8bd6cdac4a8d78654087bcd70cd49df2904b71c207590aa5bdd28"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.769689 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.773391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerDied","Data":"d0c096f9abea14bd89e01cd5df78cfd43109b66f0678b624949e1ec87cdc1cd4"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.773459 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.782951 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" podStartSLOduration=1.782933011 podStartE2EDuration="1.782933011s" podCreationTimestamp="2026-02-19 19:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:24:40.781524627 +0000 UTC m=+380.393874981" watchObservedRunningTime="2026-02-19 19:24:40.782933011 +0000 UTC m=+380.395283325"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.796132 4722 scope.go:117] "RemoveContainer" containerID="ecdd2f0fffaf519cc5830b6edc00c3c6f8ed2646ef4460850d3ebbfc25bad88c"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.822592 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64frs"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.825271 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-64frs"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.829007 4722 scope.go:117] "RemoveContainer" containerID="83c9ec76be9f3502d89c676d78e714eeea9b0340976175aeadfd0dc3726f4500"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.832948 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.839489 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.852012 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.856191 4722 scope.go:117] "RemoveContainer" containerID="5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.859987 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.864065 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.867654 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"]
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.874537 4722 scope.go:117] "RemoveContainer" containerID="fed968269de56954a9bf853304185d7d7e89b05c7032995e1f8430c840f32748"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.889230 4722 scope.go:117] "RemoveContainer" containerID="b5c97b5b76e7afa24f8f93363368d20e4563b18ad7e8eaf0a0672fe76a243f0a"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.915800 4722 scope.go:117] "RemoveContainer" containerID="46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.932021 4722 scope.go:117] "RemoveContainer" containerID="a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.958235 4722 scope.go:117] "RemoveContainer" containerID="78d9b73635fb9fd918479e49197028103f67da7ed33002bbffe05da3a4ec4523"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.970911 4722 scope.go:117] "RemoveContainer" containerID="1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.077514 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" path="/var/lib/kubelet/pods/0c9d3632-a132-4377-95ef-564cffb1f299/volumes"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.079428 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" path="/var/lib/kubelet/pods/2bb14baa-8bfc-415a-aa95-50b79f3c75ea/volumes"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.080190 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" path="/var/lib/kubelet/pods/396bbbdf-7f78-48e7-b02c-0737c221aaa6/volumes"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.082824 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" path="/var/lib/kubelet/pods/cb6886b7-9193-4c89-96c8-64b61c3251a4/volumes"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.083380 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" path="/var/lib/kubelet/pods/f10dae1c-d938-4cce-893b-4ad7eca7d23f/volumes"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.419466 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vwrjw"]
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.419847 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.419915 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.419967 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420019 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420066 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420110 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420172 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420228 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420277 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420320 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420369 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420414 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420511 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420559 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420602 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420651 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420695 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420769 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420936 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.421008 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421091 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-utilities"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.421178 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421230 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.421340 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421391 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-content"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421588 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421664 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421734 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421783 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421849 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421895 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator"
Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.422032 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.422095 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.423519 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.425570 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.430783 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwrjw"]
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.460972 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vhl\" (UniqueName: \"kubernetes.io/projected/7a6ec43d-cefe-40ee-b41e-81dc96b88739-kube-api-access-z7vhl\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.461036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-utilities\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.461069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-catalog-content\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vhl\" (UniqueName: \"kubernetes.io/projected/7a6ec43d-cefe-40ee-b41e-81dc96b88739-kube-api-access-z7vhl\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-utilities\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-catalog-content\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562978 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-catalog-content\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.563104 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-utilities\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.581012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vhl\" (UniqueName: \"kubernetes.io/projected/7a6ec43d-cefe-40ee-b41e-81dc96b88739-kube-api-access-z7vhl\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.620421 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n2l4s"]
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.621655 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.623931 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.634202 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2l4s"]
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.663539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-catalog-content\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.663802 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rn8\" (UniqueName: \"kubernetes.io/projected/19cd1ff4-6442-47bc-8c68-679c1c19abce-kube-api-access-25rn8\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.663886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-utilities\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.740796 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.764866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-catalog-content\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.764920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25rn8\" (UniqueName: \"kubernetes.io/projected/19cd1ff4-6442-47bc-8c68-679c1c19abce-kube-api-access-25rn8\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.764940 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-utilities\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.766412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-catalog-content\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.766495 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-utilities\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.783335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rn8\" (UniqueName: \"kubernetes.io/projected/19cd1ff4-6442-47bc-8c68-679c1c19abce-kube-api-access-25rn8\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.788006 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.798514 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.798832 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.937938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.084668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2l4s"]
Feb 19 19:24:42 crc kubenswrapper[4722]: W0219 19:24:42.088091 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cd1ff4_6442_47bc_8c68_679c1c19abce.slice/crio-cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3 WatchSource:0}: Error finding container cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3: Status 404 returned error can't find the container with id cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.153000 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwrjw"]
Feb 19 19:24:42 crc kubenswrapper[4722]: W0219 19:24:42.155459 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6ec43d_cefe_40ee_b41e_81dc96b88739.slice/crio-5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5 WatchSource:0}: Error finding container 5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5: Status 404 returned error can't find the container with id 5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.791888 4722 generic.go:334] "Generic (PLEG): container finished" podID="19cd1ff4-6442-47bc-8c68-679c1c19abce" containerID="6ec4885ad6409af62960f29e754821fb63e4679cc0c42272bb54d2c421104548" exitCode=0
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.791938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerDied","Data":"6ec4885ad6409af62960f29e754821fb63e4679cc0c42272bb54d2c421104548"}
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.791986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerStarted","Data":"cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3"}
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.795709 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a6ec43d-cefe-40ee-b41e-81dc96b88739" containerID="427e0d3de2b7d9f99099ae26aec2524de06631a7e6bc9daaf68410c29a5ab986" exitCode=0
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.795795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerDied","Data":"427e0d3de2b7d9f99099ae26aec2524de06631a7e6bc9daaf68410c29a5ab986"}
Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.795873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerStarted","Data":"5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5"}
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.802207 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a6ec43d-cefe-40ee-b41e-81dc96b88739" containerID="d2613d4ca6b645c54940dcb203608277edb114bdea813d90e87d749173b71b1f" exitCode=0
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.802292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerDied","Data":"d2613d4ca6b645c54940dcb203608277edb114bdea813d90e87d749173b71b1f"}
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.806913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerStarted","Data":"e9dc8c766c83abf6467a10db880aa2e1401ae3aaf2dbf53939376ab83fb22261"}
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.814117 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpzr"]
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.815363 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.816979 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.824398 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpzr"]
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.995021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-utilities\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.995101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d9x8\" (UniqueName: \"kubernetes.io/projected/277ec436-8032-4711-8573-5b2eaab8f212-kube-api-access-2d9x8\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.995246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-catalog-content\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.020010 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tr77s"]
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.021206 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.023117 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.032375 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr77s"]
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-catalog-content\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-utilities\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096962 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw87j\" (UniqueName: \"kubernetes.io/projected/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-kube-api-access-zw87j\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096983 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d9x8\" (UniqueName: \"kubernetes.io/projected/277ec436-8032-4711-8573-5b2eaab8f212-kube-api-access-2d9x8\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-catalog-content\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-utilities\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-catalog-content\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-utilities\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.116220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d9x8\" (UniqueName: \"kubernetes.io/projected/277ec436-8032-4711-8573-5b2eaab8f212-kube-api-access-2d9x8\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.141065 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.197825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw87j\" (UniqueName: \"kubernetes.io/projected/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-kube-api-access-zw87j\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.197898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-utilities\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.197935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-catalog-content\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.198385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-catalog-content\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.198587 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-utilities\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " 
pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.222936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw87j\" (UniqueName: \"kubernetes.io/projected/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-kube-api-access-zw87j\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.329703 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpzr"] Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.369866 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.555969 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr77s"] Feb 19 19:24:44 crc kubenswrapper[4722]: W0219 19:24:44.577679 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda704e2d3_bed1_47a6_a2d1_af2c3583e06c.slice/crio-61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c WatchSource:0}: Error finding container 61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c: Status 404 returned error can't find the container with id 61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.813814 4722 generic.go:334] "Generic (PLEG): container finished" podID="277ec436-8032-4711-8573-5b2eaab8f212" containerID="8c0260ca0f7f0558a17eda8de6ad6a4bd513d7b35c8a2898c91107ecdd18a35a" exitCode=0 Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.813928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" 
event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerDied","Data":"8c0260ca0f7f0558a17eda8de6ad6a4bd513d7b35c8a2898c91107ecdd18a35a"} Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.813968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerStarted","Data":"47f928d595f73ddfa8a4368ef0aa8ba0590b597c5f910a8dd66bc9eae2da0931"} Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.816956 4722 generic.go:334] "Generic (PLEG): container finished" podID="19cd1ff4-6442-47bc-8c68-679c1c19abce" containerID="e9dc8c766c83abf6467a10db880aa2e1401ae3aaf2dbf53939376ab83fb22261" exitCode=0 Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.817009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerDied","Data":"e9dc8c766c83abf6467a10db880aa2e1401ae3aaf2dbf53939376ab83fb22261"} Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.819688 4722 generic.go:334] "Generic (PLEG): container finished" podID="a704e2d3-bed1-47a6-a2d1-af2c3583e06c" containerID="bf6d642b3d56af7be565e8c58abeae3b7dcd307a2e1180c940896739bdc1908c" exitCode=0 Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.819752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerDied","Data":"bf6d642b3d56af7be565e8c58abeae3b7dcd307a2e1180c940896739bdc1908c"} Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.819773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerStarted","Data":"61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c"} Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 
19:24:44.823538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerStarted","Data":"ebf0ed52b31c6e468bcd90184c245b728e44bc4262736fa3b1379a35674a2720"} Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.884213 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vwrjw" podStartSLOduration=2.439384419 podStartE2EDuration="3.884187748s" podCreationTimestamp="2026-02-19 19:24:41 +0000 UTC" firstStartedPulling="2026-02-19 19:24:42.797976337 +0000 UTC m=+382.410326661" lastFinishedPulling="2026-02-19 19:24:44.242779666 +0000 UTC m=+383.855129990" observedRunningTime="2026-02-19 19:24:44.883114634 +0000 UTC m=+384.495464958" watchObservedRunningTime="2026-02-19 19:24:44.884187748 +0000 UTC m=+384.496538082" Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.829415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerStarted","Data":"63efecbfdbce9ed69369473ff3c9ea7e712f6279d7c1b8296d2b5455bf799f93"} Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.831195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerStarted","Data":"2cbef89681851c35b3099021254ae070e04a5a0e215e89780d4ea53d5f2c05be"} Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.832656 4722 generic.go:334] "Generic (PLEG): container finished" podID="277ec436-8032-4711-8573-5b2eaab8f212" containerID="0cbf7ecd5e0f5b3dbc145a34bb9d664402290a42de9d6983be300359ba8d4274" exitCode=0 Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.832721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" 
event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerDied","Data":"0cbf7ecd5e0f5b3dbc145a34bb9d664402290a42de9d6983be300359ba8d4274"} Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.847410 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n2l4s" podStartSLOduration=2.325751633 podStartE2EDuration="4.847391414s" podCreationTimestamp="2026-02-19 19:24:41 +0000 UTC" firstStartedPulling="2026-02-19 19:24:42.793426726 +0000 UTC m=+382.405777050" lastFinishedPulling="2026-02-19 19:24:45.315066507 +0000 UTC m=+384.927416831" observedRunningTime="2026-02-19 19:24:45.84403018 +0000 UTC m=+385.456380504" watchObservedRunningTime="2026-02-19 19:24:45.847391414 +0000 UTC m=+385.459741738" Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.846727 4722 generic.go:334] "Generic (PLEG): container finished" podID="a704e2d3-bed1-47a6-a2d1-af2c3583e06c" containerID="2cbef89681851c35b3099021254ae070e04a5a0e215e89780d4ea53d5f2c05be" exitCode=0 Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.846827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerDied","Data":"2cbef89681851c35b3099021254ae070e04a5a0e215e89780d4ea53d5f2c05be"} Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.859654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerStarted","Data":"868af1b6814a812654e403eafe22dece8d3939db37b70dad98e25cd53360d0ca"} Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.883822 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xhpzr" podStartSLOduration=2.421865899 podStartE2EDuration="3.883801347s" podCreationTimestamp="2026-02-19 19:24:43 +0000 UTC" 
firstStartedPulling="2026-02-19 19:24:44.815368535 +0000 UTC m=+384.427718859" lastFinishedPulling="2026-02-19 19:24:46.277303993 +0000 UTC m=+385.889654307" observedRunningTime="2026-02-19 19:24:46.880532187 +0000 UTC m=+386.492882511" watchObservedRunningTime="2026-02-19 19:24:46.883801347 +0000 UTC m=+386.496151681" Feb 19 19:24:47 crc kubenswrapper[4722]: I0219 19:24:47.866786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerStarted","Data":"2cad19f1c16d165313021fd5d803db4280cb827de7bc5fe079236904879e4326"} Feb 19 19:24:47 crc kubenswrapper[4722]: I0219 19:24:47.889424 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tr77s" podStartSLOduration=1.4606881170000001 podStartE2EDuration="3.889404741s" podCreationTimestamp="2026-02-19 19:24:44 +0000 UTC" firstStartedPulling="2026-02-19 19:24:44.822121303 +0000 UTC m=+384.434471647" lastFinishedPulling="2026-02-19 19:24:47.250837947 +0000 UTC m=+386.863188271" observedRunningTime="2026-02-19 19:24:47.88676159 +0000 UTC m=+387.499111924" watchObservedRunningTime="2026-02-19 19:24:47.889404741 +0000 UTC m=+387.501755065" Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.740931 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.741653 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.790605 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.922690 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.938680 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.938746 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.981915 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:52 crc kubenswrapper[4722]: I0219 19:24:52.923451 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.142197 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.142723 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.198746 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.370202 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.370262 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.416460 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 
19:24:54.944245 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.946856 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:25:03 crc kubenswrapper[4722]: I0219 19:25:03.753764 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry" containerID="cri-o://b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8" gracePeriod=30 Feb 19 19:25:03 crc kubenswrapper[4722]: I0219 19:25:03.948049 4722 generic.go:334] "Generic (PLEG): container finished" podID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerID="b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8" exitCode=0 Feb 19 19:25:03 crc kubenswrapper[4722]: I0219 19:25:03.948099 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerDied","Data":"b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8"} Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.150606 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173252 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173363 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173459 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.175524 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.175536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.187566 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz" (OuterVolumeSpecName: "kube-api-access-d2csz") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "kube-api-access-d2csz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.189357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.192648 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.199284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.199671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.200487 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274341 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274382 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274394 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") on node \"crc\" DevicePath \"\"" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274404 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274415 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274422 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274430 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.956250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerDied","Data":"7fc589c7d609f9f8ea97795796aadbb293f365ba97d8b385ba4c6ea2f33eb413"} Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.956334 4722 scope.go:117] "RemoveContainer" containerID="b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8" Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.956332 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:25:05 crc kubenswrapper[4722]: I0219 19:25:05.013012 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:25:05 crc kubenswrapper[4722]: I0219 19:25:05.019328 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:25:05 crc kubenswrapper[4722]: I0219 19:25:05.076529 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" path="/var/lib/kubelet/pods/8d31d88d-2e34-4b55-b843-b8a67b957680/volumes" Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.798368 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.798707 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.798747 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.799383 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.799438 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312" gracePeriod=600 Feb 19 19:25:12 crc kubenswrapper[4722]: I0219 19:25:12.008204 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312" exitCode=0 Feb 19 19:25:12 crc kubenswrapper[4722]: I0219 19:25:12.008300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312"} Feb 19 19:25:12 crc kubenswrapper[4722]: I0219 19:25:12.008485 4722 scope.go:117] "RemoveContainer" containerID="dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d" Feb 19 19:25:13 crc kubenswrapper[4722]: I0219 19:25:13.018640 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c"} Feb 19 19:27:41 crc kubenswrapper[4722]: I0219 19:27:41.798429 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:27:41 crc kubenswrapper[4722]: I0219 
19:27:41.799358 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:28:11 crc kubenswrapper[4722]: I0219 19:28:11.798778 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:28:11 crc kubenswrapper[4722]: I0219 19:28:11.799499 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.798100 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.798807 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.798878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.799681 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.799762 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c" gracePeriod=600 Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.941823 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c" exitCode=0 Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.941891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c"} Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.942433 4722 scope.go:117] "RemoveContainer" containerID="793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312" Feb 19 19:28:42 crc kubenswrapper[4722]: I0219 19:28:42.952394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84"} Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.492787 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"] Feb 19 19:29:33 crc kubenswrapper[4722]: E0219 19:29:33.493590 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.493604 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.493721 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.494543 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.496439 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.509451 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"] Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.656396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.656494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.656581 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: 
I0219 19:29:33.757606 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.757715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.757804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.758292 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.758468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.780277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.812331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:34 crc kubenswrapper[4722]: I0219 19:29:34.208584 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"] Feb 19 19:29:34 crc kubenswrapper[4722]: I0219 19:29:34.266912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerStarted","Data":"1308967f71d0da97d8a17acc24d6fc36d84d34da1a47983f745a8e51d759a0d1"} Feb 19 19:29:35 crc kubenswrapper[4722]: I0219 19:29:35.271896 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerID="05a15e1754c52e72e4ad935cf4ccc48bc64ecd37b25378cce07f5da46eaa3ad8" exitCode=0 Feb 19 19:29:35 crc kubenswrapper[4722]: I0219 19:29:35.271998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"05a15e1754c52e72e4ad935cf4ccc48bc64ecd37b25378cce07f5da46eaa3ad8"} Feb 19 19:29:35 crc kubenswrapper[4722]: I0219 19:29:35.273202 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:29:37 crc kubenswrapper[4722]: I0219 19:29:37.285999 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerID="bc043e9398d53a46c65eb4ada63ed6b08dc67bdc0122b2f97f793a5d9f61963c" exitCode=0 Feb 19 19:29:37 crc kubenswrapper[4722]: I0219 19:29:37.286116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"bc043e9398d53a46c65eb4ada63ed6b08dc67bdc0122b2f97f793a5d9f61963c"} Feb 19 19:29:38 crc kubenswrapper[4722]: I0219 19:29:38.292845 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerID="ab5dc2d8e8e2ea2125bcadf938b8bd230c98a434bfab23a5895f1114319d2456" exitCode=0 Feb 19 19:29:38 crc kubenswrapper[4722]: I0219 19:29:38.292891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"ab5dc2d8e8e2ea2125bcadf938b8bd230c98a434bfab23a5895f1114319d2456"} Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.530624 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.635388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.635454 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.635518 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.640055 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle" (OuterVolumeSpecName: "bundle") pod "ae31c080-c2a8-484e-9d6a-bd55ca4ae533" (UID: "ae31c080-c2a8-484e-9d6a-bd55ca4ae533"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.647026 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq" (OuterVolumeSpecName: "kube-api-access-847tq") pod "ae31c080-c2a8-484e-9d6a-bd55ca4ae533" (UID: "ae31c080-c2a8-484e-9d6a-bd55ca4ae533"). InnerVolumeSpecName "kube-api-access-847tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.648280 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.648376 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.717613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util" (OuterVolumeSpecName: "util") pod "ae31c080-c2a8-484e-9d6a-bd55ca4ae533" (UID: "ae31c080-c2a8-484e-9d6a-bd55ca4ae533"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.749672 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:40 crc kubenswrapper[4722]: I0219 19:29:40.305486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"1308967f71d0da97d8a17acc24d6fc36d84d34da1a47983f745a8e51d759a0d1"} Feb 19 19:29:40 crc kubenswrapper[4722]: I0219 19:29:40.305541 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1308967f71d0da97d8a17acc24d6fc36d84d34da1a47983f745a8e51d759a0d1" Feb 19 19:29:40 crc kubenswrapper[4722]: I0219 19:29:40.305562 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.858497 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859112 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-controller" containerID="cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859173 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" containerID="cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859231 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" containerID="cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859236 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" containerID="cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859286 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859284 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" containerID="cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859233 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" containerID="cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.892013 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" containerID="cri-o://c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" gracePeriod=30 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.225087 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.228100 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-acl-logging/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.228802 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-controller/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.229380 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321126 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321254 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321305 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321313 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321322 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321337 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321495 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321492 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket" (OuterVolumeSpecName: "log-socket") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321547 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321544 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log" (OuterVolumeSpecName: "node-log") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321563 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321689 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash" (OuterVolumeSpecName: "host-slash") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321848 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322116 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322143 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322177 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322190 4722 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322207 4722 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322221 4722 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322233 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322245 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322257 4722 reconciler_common.go:293] "Volume detached for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322268 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322306 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322319 4722 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322329 4722 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322339 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322349 4722 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322362 4722 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") 
on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322372 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322383 4722 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.332347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p" (OuterVolumeSpecName: "kube-api-access-zjr2p") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "kube-api-access-zjr2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.334464 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.338688 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.342659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.350214 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-acl-logging/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.350659 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-controller/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.350978 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351001 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351009 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 
19:29:45.351016 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351022 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351029 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351036 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" exitCode=143 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351043 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" exitCode=143 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351123 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351132 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351162 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351173 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351183 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351188 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351195 4722 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351202 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351209 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351215 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351223 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351228 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351243 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc 
kubenswrapper[4722]: I0219 19:29:45.351248 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351254 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351259 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351264 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351269 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351274 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351279 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351284 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc 
kubenswrapper[4722]: I0219 19:29:45.351288 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351296 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351305 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351311 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351318 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351323 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351328 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351333 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351338 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351343 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351348 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351353 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351360 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351367 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351372 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351379 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351384 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351389 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351394 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351399 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351405 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351410 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351415 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351429 4722 
scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351563 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359083 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmh6g"] Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359286 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359298 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359308 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359315 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359323 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359329 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359337 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359342 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359350 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359356 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359363 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359370 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359380 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="util" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359386 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="util" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359403 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359412 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kubecfg-setup" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359418 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" 
containerName="kubecfg-setup" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359423 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359429 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359439 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359445 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359452 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="extract" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359458 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="extract" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359467 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359473 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359482 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="pull" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359487 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="pull" Feb 19 19:29:45 crc 
kubenswrapper[4722]: E0219 19:29:45.359494 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359499 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359589 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359598 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359606 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359615 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="extract" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359623 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359632 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359640 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" 
containerName="ovn-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359646 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359654 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359662 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359669 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359751 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359757 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359837 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.364459 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.372870 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.373037 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/2.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.373056 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.373126 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374472 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374514 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2" exitCode=2 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374544 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerDied","Data":"1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374564 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374955 4722 scope.go:117] "RemoveContainer" 
containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.375124 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jnvgg_openshift-multus(7a80fcd7-8ac4-4e82-8f14-93d225898bb5)\"" pod="openshift-multus/multus-jnvgg" podUID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.379654 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.403385 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.423359 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424384 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-config\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-netns\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424443 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-slash\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424492 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-systemd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424522 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-systemd-units\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-node-log\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-bin\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-etc-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-var-lib-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424691 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-env-overrides\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424711 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-log-socket\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424745 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424763 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6h9\" (UniqueName: \"kubernetes.io/projected/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-kube-api-access-hr6h9\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424783 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-kubelet\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 
19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-ovn\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-netd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-script-lib\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424865 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424876 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424885 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") on node 
\"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424894 4722 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424903 4722 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.441400 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.459323 4722 scope.go:117] "RemoveContainer" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.465206 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.470441 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.485974 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.500708 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.514753 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-var-lib-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525809 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-env-overrides\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-log-socket\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-var-lib-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-log-socket\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6h9\" (UniqueName: 
\"kubernetes.io/projected/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-kube-api-access-hr6h9\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-kubelet\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-ovn\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526096 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-netd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-script-lib\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-config\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-kubelet\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526168 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-netns\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-slash\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526188 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-netd\") pod 
\"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-systemd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526222 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-systemd-units\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-node-log\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-bin\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526284 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-etc-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-etc-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526360 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-slash\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526221 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-ovn\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526383 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-node-log\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-bin\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526449 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-systemd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-systemd-units\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-env-overrides\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526583 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-netns\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-script-lib\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.529245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.529722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-config\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.530793 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.549709 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.550122 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550179 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550212 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.550652 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550693 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550718 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.551209 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551238 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551258 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.551533 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551583 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container 
\"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551616 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551699 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6h9\" (UniqueName: \"kubernetes.io/projected/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-kube-api-access-hr6h9\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.551882 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551915 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551938 4722 scope.go:117] "RemoveContainer" 
containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552140 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552196 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552214 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552421 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552454 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552472 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552707 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552739 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552757 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552972 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552995 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553008 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.553191 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553214 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container 
\"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553227 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553406 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553423 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553616 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553645 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553804 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553822 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553975 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553990 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554143 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 
2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554179 4722 scope.go:117] "RemoveContainer" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554577 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554618 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554933 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554950 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555234 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555276 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555564 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555580 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555847 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not 
exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555865 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556192 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556233 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556538 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556556 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556755 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status 
\"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556780 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556947 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556962 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557115 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557130 4722 scope.go:117] "RemoveContainer" 
containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557433 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557464 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557677 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557705 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557899 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could 
not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557926 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558129 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558170 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558434 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558461 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 
19:29:45.558716 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558741 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558962 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558984 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559274 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 
4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559298 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559590 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559623 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559820 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559838 4722 scope.go:117] "RemoveContainer" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560029 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560054 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560287 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560307 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560521 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not 
exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560548 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560764 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560793 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.561009 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.561031 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.561294 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status 
\"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.690711 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: W0219 19:29:45.707613 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b2b829e_4254_4c5c_a130_ed72dcc47cc7.slice/crio-3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b WatchSource:0}: Error finding container 3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b: Status 404 returned error can't find the container with id 3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b Feb 19 19:29:46 crc kubenswrapper[4722]: I0219 19:29:46.380616 4722 generic.go:334] "Generic (PLEG): container finished" podID="1b2b829e-4254-4c5c-a130-ed72dcc47cc7" containerID="4bf185214f3644a24e32df5d310671d0683ebc3edcb6d28dd643c8846c67184e" exitCode=0 Feb 19 19:29:46 crc kubenswrapper[4722]: I0219 19:29:46.380651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerDied","Data":"4bf185214f3644a24e32df5d310671d0683ebc3edcb6d28dd643c8846c67184e"} Feb 19 19:29:46 crc kubenswrapper[4722]: I0219 19:29:46.380670 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 
19:29:47.079587 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" path="/var/lib/kubelet/pods/5eb7c404-f96e-43a7-b20f-b45d856c75a5/volumes" Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.387869 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"1d098cfdd051c75593d774b27da82684e58c737e122ab035584717c4f33a7c37"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"6a9ba0f35c9d2704c8f98b593133e289ee645378a869f3082e2c0a01f8c2ef46"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388137 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"69d72ac00e6a4b8153c6f4a81456f2b8a6fc62140a5cb35d5ba12ff3fb99cd7c"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"c8da5603c232c8713e9df3dc78a1c2b636b33a23b756a20515b71b2567ab73a9"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388168 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"f19279e6e8b313dcdd1cbe3dcdda50dd7a9766155963895e14134ea5a6cc399e"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388179 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" 
event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"62d4788570ec88e00e4f2f93a88ca63d605383e3f28ef63c1727f7dd170d1b4f"} Feb 19 19:29:50 crc kubenswrapper[4722]: I0219 19:29:50.403632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"a87638205fc09488df3965e599b074028e0c991c41692673bd7cc58945dd41ce"} Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.056337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.056922 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.059143 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.059181 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-ks2vz" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.059507 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.113296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z55j\" (UniqueName: \"kubernetes.io/projected/572e9436-e389-4b1e-b86f-e13f14f8d3eb-kube-api-access-8z55j\") pod \"obo-prometheus-operator-68bc856cb9-v7lzn\" (UID: \"572e9436-e389-4b1e-b86f-e13f14f8d3eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.183413 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.184046 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.185924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qvqrv" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.186359 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.190906 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.191534 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214420 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z55j\" (UniqueName: \"kubernetes.io/projected/572e9436-e389-4b1e-b86f-e13f14f8d3eb-kube-api-access-8z55j\") pod \"obo-prometheus-operator-68bc856cb9-v7lzn\" (UID: \"572e9436-e389-4b1e-b86f-e13f14f8d3eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc 
kubenswrapper[4722]: I0219 19:29:51.214442 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.246544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z55j\" (UniqueName: \"kubernetes.io/projected/572e9436-e389-4b1e-b86f-e13f14f8d3eb-kube-api-access-8z55j\") pod \"obo-prometheus-operator-68bc856cb9-v7lzn\" (UID: \"572e9436-e389-4b1e-b86f-e13f14f8d3eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.315886 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.315994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.316044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.316116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.322353 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.322625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.333714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.333809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.372726 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.392616 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8xtkk"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.393785 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.395701 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-mzsjk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.395927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401756 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401814 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401847 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401890 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podUID="572e9436-e389-4b1e-b86f-e13f14f8d3eb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.416888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/68e6d18b-f149-46fb-ba46-8fb37d82712a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.416949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8svlc\" (UniqueName: \"kubernetes.io/projected/68e6d18b-f149-46fb-ba46-8fb37d82712a-kube-api-access-8svlc\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.500592 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.508474 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.517525 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/68e6d18b-f149-46fb-ba46-8fb37d82712a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.517577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8svlc\" (UniqueName: \"kubernetes.io/projected/68e6d18b-f149-46fb-ba46-8fb37d82712a-kube-api-access-8svlc\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526444 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526524 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526554 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526614 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podUID="1577ee2f-abd8-4e61-9fd1-238960e8bdf6" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.528668 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/68e6d18b-f149-46fb-ba46-8fb37d82712a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.538779 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8svlc\" (UniqueName: \"kubernetes.io/projected/68e6d18b-f149-46fb-ba46-8fb37d82712a-kube-api-access-8svlc\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540346 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540429 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540452 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540496 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podUID="cc8f56cb-a9d1-4b27-adca-40adf6902cc8" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.597740 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4qpbt"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.598622 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.600478 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gvb9x" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.619859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fnfx\" (UniqueName: \"kubernetes.io/projected/7f659845-54cc-4e5c-892c-a754900c1f39-kube-api-access-7fnfx\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.619908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f659845-54cc-4e5c-892c-a754900c1f39-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.721312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fnfx\" (UniqueName: \"kubernetes.io/projected/7f659845-54cc-4e5c-892c-a754900c1f39-kube-api-access-7fnfx\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.721367 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f659845-54cc-4e5c-892c-a754900c1f39-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 
19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.722258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f659845-54cc-4e5c-892c-a754900c1f39-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.747173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fnfx\" (UniqueName: \"kubernetes.io/projected/7f659845-54cc-4e5c-892c-a754900c1f39-kube-api-access-7fnfx\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.750897 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789471 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789549 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789570 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789621 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.913458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938812 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938892 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938923 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938979 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podUID="7f659845-54cc-4e5c-892c-a754900c1f39" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.417422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"6116fe360f91a9c7c2b769555ed6c7f2c97cfbaa39edbdb83476ab624f25d005"} Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.417723 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.417777 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.447886 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.450909 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" podStartSLOduration=7.450892913 
podStartE2EDuration="7.450892913s" podCreationTimestamp="2026-02-19 19:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:29:52.447179457 +0000 UTC m=+692.059529781" watchObservedRunningTime="2026-02-19 19:29:52.450892913 +0000 UTC m=+692.063243237" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.516093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.516242 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.516719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.521874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.522033 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.522522 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.526720 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.526844 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.527372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.534136 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4qpbt"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.534315 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.534691 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.537597 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8xtkk"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.537679 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.537983 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597347 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597418 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597441 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podUID="572e9436-e389-4b1e-b86f-e13f14f8d3eb" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602464 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602527 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602546 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podUID="1577ee2f-abd8-4e61-9fd1-238960e8bdf6" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619540 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619598 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619621 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619665 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podUID="cc8f56cb-a9d1-4b27-adca-40adf6902cc8" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625104 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625187 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625205 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625237 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633263 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633326 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633347 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633393 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podUID="7f659845-54cc-4e5c-892c-a754900c1f39" Feb 19 19:29:53 crc kubenswrapper[4722]: I0219 19:29:53.436914 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:53 crc kubenswrapper[4722]: I0219 19:29:53.461928 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:58 crc kubenswrapper[4722]: I0219 19:29:58.071129 4722 scope.go:117] "RemoveContainer" containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2" Feb 19 19:29:58 crc kubenswrapper[4722]: E0219 19:29:58.072080 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jnvgg_openshift-multus(7a80fcd7-8ac4-4e82-8f14-93d225898bb5)\"" pod="openshift-multus/multus-jnvgg" podUID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.165535 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"] Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.166394 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.170691 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.171024 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.180723 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"] Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.320624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.320951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.320999 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.422145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.422346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.422379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.423520 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.442875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.468303 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.490301 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517607 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517732 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517805 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517905 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" Feb 19 19:30:01 crc kubenswrapper[4722]: I0219 19:30:01.479715 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:01 crc kubenswrapper[4722]: I0219 19:30:01.480133 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500742 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500813 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500845 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500904 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.071368 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.071395 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.072886 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.073188 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116680 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116759 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116795 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116843 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podUID="7f659845-54cc-4e5c-892c-a754900c1f39" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123355 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123416 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123436 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123479 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podUID="cc8f56cb-a9d1-4b27-adca-40adf6902cc8" Feb 19 19:30:06 crc kubenswrapper[4722]: I0219 19:30:06.070921 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:06 crc kubenswrapper[4722]: I0219 19:30:06.071644 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092191 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092257 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092277 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092320 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" Feb 19 19:30:07 crc kubenswrapper[4722]: I0219 19:30:07.071297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:07 crc kubenswrapper[4722]: I0219 19:30:07.071792 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100612 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100686 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100722 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100767 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podUID="572e9436-e389-4b1e-b86f-e13f14f8d3eb" Feb 19 19:30:08 crc kubenswrapper[4722]: I0219 19:30:08.071261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:08 crc kubenswrapper[4722]: I0219 19:30:08.071778 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.103222 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.104177 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.104310 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.104454 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podUID="1577ee2f-abd8-4e61-9fd1-238960e8bdf6" Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.070870 4722 scope.go:117] "RemoveContainer" containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2" Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.535552 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/2.log" Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.536751 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log" Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.536826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"c50852e5b77d05de6aa6ecb4533ea82b9f06d7b4cf8cb98687ee0f15fe36d8dc"} Feb 19 19:30:15 crc kubenswrapper[4722]: I0219 19:30:15.711369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071052 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071120 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071691 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071888 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.489258 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"] Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.492688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"] Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.564350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" event={"ID":"cc8f56cb-a9d1-4b27-adca-40adf6902cc8","Type":"ContainerStarted","Data":"78f276ac129300e83d241e7681abee219295215443de74bfb3aa4ae2e9b7e9da"} Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.565488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerStarted","Data":"16269b9bb52f2154f76e0376730fd1345c43b7c76b17bb53cb71d2835a369a19"} Feb 19 19:30:18 crc kubenswrapper[4722]: I0219 19:30:18.574924 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerStarted","Data":"1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe"} Feb 19 19:30:18 crc kubenswrapper[4722]: I0219 19:30:18.593589 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" 
podStartSLOduration=18.593556372 podStartE2EDuration="18.593556372s" podCreationTimestamp="2026-02-19 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:30:18.590798966 +0000 UTC m=+718.203149320" watchObservedRunningTime="2026-02-19 19:30:18.593556372 +0000 UTC m=+718.205906726" Feb 19 19:30:19 crc kubenswrapper[4722]: I0219 19:30:19.583250 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerID="1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe" exitCode=0 Feb 19 19:30:19 crc kubenswrapper[4722]: I0219 19:30:19.583310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerDied","Data":"1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe"} Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.070205 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.070290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.071051 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.071272 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.070846 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.075389 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.269783 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.333408 4722 scope.go:117] "RemoveContainer" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.398575 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"b6513190-cf4a-405f-a7ca-c35f37d63725\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.399050 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"b6513190-cf4a-405f-a7ca-c35f37d63725\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.399088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"b6513190-cf4a-405f-a7ca-c35f37d63725\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.400483 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "b6513190-cf4a-405f-a7ca-c35f37d63725" (UID: "b6513190-cf4a-405f-a7ca-c35f37d63725"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.400741 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.406237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6513190-cf4a-405f-a7ca-c35f37d63725" (UID: "b6513190-cf4a-405f-a7ca-c35f37d63725"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.406326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt" (OuterVolumeSpecName: "kube-api-access-cs9pt") pod "b6513190-cf4a-405f-a7ca-c35f37d63725" (UID: "b6513190-cf4a-405f-a7ca-c35f37d63725"). InnerVolumeSpecName "kube-api-access-cs9pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.501542 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") on node \"crc\" DevicePath \"\"" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.501573 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.600033 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerDied","Data":"16269b9bb52f2154f76e0376730fd1345c43b7c76b17bb53cb71d2835a369a19"} Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.600072 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16269b9bb52f2154f76e0376730fd1345c43b7c76b17bb53cb71d2835a369a19" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.600124 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.607634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" event={"ID":"cc8f56cb-a9d1-4b27-adca-40adf6902cc8","Type":"ContainerStarted","Data":"82558eb0c28911811b707e4a99b732be0a01d144dafe7608a10ed064d3554ef4"} Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.610637 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/2.log" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.611366 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8xtkk"] Feb 19 19:30:21 crc kubenswrapper[4722]: W0219 19:30:21.615589 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68e6d18b_f149_46fb_ba46_8fb37d82712a.slice/crio-e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74 WatchSource:0}: Error finding container e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74: Status 404 returned error can't find the container with id e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74 Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.644132 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podStartSLOduration=26.851998416 podStartE2EDuration="30.644100392s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:17.503587668 +0000 UTC m=+717.115937992" lastFinishedPulling="2026-02-19 19:30:21.295689634 +0000 UTC m=+720.908039968" observedRunningTime="2026-02-19 19:30:21.63597583 +0000 UTC m=+721.248326164" 
watchObservedRunningTime="2026-02-19 19:30:21.644100392 +0000 UTC m=+721.256450736" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.695804 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"] Feb 19 19:30:21 crc kubenswrapper[4722]: W0219 19:30:21.701339 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572e9436_e389_4b1e_b86f_e13f14f8d3eb.slice/crio-5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02 WatchSource:0}: Error finding container 5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02: Status 404 returned error can't find the container with id 5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02 Feb 19 19:30:21 crc kubenswrapper[4722]: W0219 19:30:21.702340 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f659845_54cc_4e5c_892c_a754900c1f39.slice/crio-38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265 WatchSource:0}: Error finding container 38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265: Status 404 returned error can't find the container with id 38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265 Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.705095 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4qpbt"] Feb 19 19:30:22 crc kubenswrapper[4722]: I0219 19:30:22.630574 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" event={"ID":"7f659845-54cc-4e5c-892c-a754900c1f39","Type":"ContainerStarted","Data":"38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265"} Feb 19 19:30:22 crc kubenswrapper[4722]: I0219 19:30:22.633908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" event={"ID":"68e6d18b-f149-46fb-ba46-8fb37d82712a","Type":"ContainerStarted","Data":"e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74"} Feb 19 19:30:22 crc kubenswrapper[4722]: I0219 19:30:22.636663 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" event={"ID":"572e9436-e389-4b1e-b86f-e13f14f8d3eb","Type":"ContainerStarted","Data":"5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02"} Feb 19 19:30:23 crc kubenswrapper[4722]: I0219 19:30:23.071390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:23 crc kubenswrapper[4722]: I0219 19:30:23.071839 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:23 crc kubenswrapper[4722]: I0219 19:30:23.662401 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"] Feb 19 19:30:23 crc kubenswrapper[4722]: W0219 19:30:23.927203 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1577ee2f_abd8_4e61_9fd1_238960e8bdf6.slice/crio-2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c WatchSource:0}: Error finding container 2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c: Status 404 returned error can't find the container with id 2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c Feb 19 19:30:24 crc kubenswrapper[4722]: I0219 19:30:24.651377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" 
event={"ID":"1577ee2f-abd8-4e61-9fd1-238960e8bdf6","Type":"ContainerStarted","Data":"2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.664650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" event={"ID":"7f659845-54cc-4e5c-892c-a754900c1f39","Type":"ContainerStarted","Data":"0bd5ee31c75a951c493b2c2ca4402b386531ee61a599ed5fc4021920db0ef246"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.665007 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.665898 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" event={"ID":"68e6d18b-f149-46fb-ba46-8fb37d82712a","Type":"ContainerStarted","Data":"5cf514f490e4904f3f000080860a51c110530446d7f150f694a82586d5ff7c5f"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.666362 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.667359 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" event={"ID":"1577ee2f-abd8-4e61-9fd1-238960e8bdf6","Type":"ContainerStarted","Data":"c97c43d59f507122bdf9409502429cb4ae989ab87c9692e9bf196d9d27e6bb2c"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.669443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" event={"ID":"572e9436-e389-4b1e-b86f-e13f14f8d3eb","Type":"ContainerStarted","Data":"a6ccd71954305587c05c2fb66e8fdb36db4f19f68f0578f966deccff3695ed36"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.679846 4722 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podStartSLOduration=31.329892945 podStartE2EDuration="35.679830553s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:21.705987928 +0000 UTC m=+721.318338252" lastFinishedPulling="2026-02-19 19:30:26.055925536 +0000 UTC m=+725.668275860" observedRunningTime="2026-02-19 19:30:26.678545552 +0000 UTC m=+726.290895886" watchObservedRunningTime="2026-02-19 19:30:26.679830553 +0000 UTC m=+726.292180877" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.701719 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podStartSLOduration=31.232645089 podStartE2EDuration="35.701698393s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:21.618385952 +0000 UTC m=+721.230736276" lastFinishedPulling="2026-02-19 19:30:26.087439256 +0000 UTC m=+725.699789580" observedRunningTime="2026-02-19 19:30:26.698329398 +0000 UTC m=+726.310679742" watchObservedRunningTime="2026-02-19 19:30:26.701698393 +0000 UTC m=+726.314048717" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.704671 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.721431 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podStartSLOduration=35.721412636 podStartE2EDuration="35.721412636s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:30:26.719622221 +0000 UTC m=+726.331972565" watchObservedRunningTime="2026-02-19 19:30:26.721412636 +0000 UTC m=+726.333762970" Feb 19 
19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.751471 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podStartSLOduration=31.405899627 podStartE2EDuration="35.75145103s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:21.704518121 +0000 UTC m=+721.316868455" lastFinishedPulling="2026-02-19 19:30:26.050069534 +0000 UTC m=+725.662419858" observedRunningTime="2026-02-19 19:30:26.746442965 +0000 UTC m=+726.358793289" watchObservedRunningTime="2026-02-19 19:30:26.75145103 +0000 UTC m=+726.363801354" Feb 19 19:30:31 crc kubenswrapper[4722]: I0219 19:30:31.916588 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.865205 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-242s6"] Feb 19 19:30:37 crc kubenswrapper[4722]: E0219 19:30:37.865517 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerName="collect-profiles" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.865534 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerName="collect-profiles" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.865692 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerName="collect-profiles" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.866196 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.872948 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fz7bp"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.873811 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.873934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.873960 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5qm5q" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.874045 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.879689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-lr45c" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.884124 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-242s6"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.887325 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fz7bp"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.902955 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hzrck"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.903595 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.907175 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6qg99" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.920816 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hzrck"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.936378 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfvc\" (UniqueName: \"kubernetes.io/projected/9545d522-f459-4b98-ac7f-d107189b7497-kube-api-access-ngfvc\") pod \"cert-manager-858654f9db-fz7bp\" (UID: \"9545d522-f459-4b98-ac7f-d107189b7497\") " pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.038639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfvc\" (UniqueName: \"kubernetes.io/projected/9545d522-f459-4b98-ac7f-d107189b7497-kube-api-access-ngfvc\") pod \"cert-manager-858654f9db-fz7bp\" (UID: \"9545d522-f459-4b98-ac7f-d107189b7497\") " pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.038722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdcnl\" (UniqueName: \"kubernetes.io/projected/e49e50d8-05f3-42f4-a03a-f3a750e1a134-kube-api-access-kdcnl\") pod \"cert-manager-webhook-687f57d79b-hzrck\" (UID: \"e49e50d8-05f3-42f4-a03a-f3a750e1a134\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.038828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbft\" (UniqueName: 
\"kubernetes.io/projected/b1356eef-86bd-4fbf-beb6-a98cd8bc60b8-kube-api-access-6jbft\") pod \"cert-manager-cainjector-cf98fcc89-242s6\" (UID: \"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.056053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfvc\" (UniqueName: \"kubernetes.io/projected/9545d522-f459-4b98-ac7f-d107189b7497-kube-api-access-ngfvc\") pod \"cert-manager-858654f9db-fz7bp\" (UID: \"9545d522-f459-4b98-ac7f-d107189b7497\") " pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.140441 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdcnl\" (UniqueName: \"kubernetes.io/projected/e49e50d8-05f3-42f4-a03a-f3a750e1a134-kube-api-access-kdcnl\") pod \"cert-manager-webhook-687f57d79b-hzrck\" (UID: \"e49e50d8-05f3-42f4-a03a-f3a750e1a134\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.140497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbft\" (UniqueName: \"kubernetes.io/projected/b1356eef-86bd-4fbf-beb6-a98cd8bc60b8-kube-api-access-6jbft\") pod \"cert-manager-cainjector-cf98fcc89-242s6\" (UID: \"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.158490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdcnl\" (UniqueName: \"kubernetes.io/projected/e49e50d8-05f3-42f4-a03a-f3a750e1a134-kube-api-access-kdcnl\") pod \"cert-manager-webhook-687f57d79b-hzrck\" (UID: \"e49e50d8-05f3-42f4-a03a-f3a750e1a134\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.158600 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbft\" (UniqueName: \"kubernetes.io/projected/b1356eef-86bd-4fbf-beb6-a98cd8bc60b8-kube-api-access-6jbft\") pod \"cert-manager-cainjector-cf98fcc89-242s6\" (UID: \"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.189119 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.199936 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.222086 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.516012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-242s6"] Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.519065 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fz7bp"] Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.529699 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hzrck"] Feb 19 19:30:38 crc kubenswrapper[4722]: W0219 19:30:38.534461 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode49e50d8_05f3_42f4_a03a_f3a750e1a134.slice/crio-92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037 WatchSource:0}: Error finding container 92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037: Status 404 returned error can't find the container with id 92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037 
Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.738717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fz7bp" event={"ID":"9545d522-f459-4b98-ac7f-d107189b7497","Type":"ContainerStarted","Data":"b67c9229671b56ba1322ed5dbbdf35f841cd1e67d76f661e68ebc4ef19ec5b05"} Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.740287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" event={"ID":"e49e50d8-05f3-42f4-a03a-f3a750e1a134","Type":"ContainerStarted","Data":"92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037"} Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.741512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" event={"ID":"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8","Type":"ContainerStarted","Data":"2fc37e4eeae0a989d8d4667e09ed7712db1c39c2e444c38a29d03e0b4c7409b1"} Feb 19 19:30:42 crc kubenswrapper[4722]: I0219 19:30:42.772126 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" event={"ID":"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8","Type":"ContainerStarted","Data":"5e3fa562f34cd8581894b729c28d1b5f25ce28e100318a0efe44f385cee0bae8"} Feb 19 19:30:42 crc kubenswrapper[4722]: I0219 19:30:42.795866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fz7bp" event={"ID":"9545d522-f459-4b98-ac7f-d107189b7497","Type":"ContainerStarted","Data":"48c1d84017602a58e21e5afdbfaab054fb62192b6c7ff091473bfb95d08326b1"} Feb 19 19:30:42 crc kubenswrapper[4722]: I0219 19:30:42.822802 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fz7bp" podStartSLOduration=1.899098623 podStartE2EDuration="5.822785793s" podCreationTimestamp="2026-02-19 19:30:37 +0000 UTC" firstStartedPulling="2026-02-19 19:30:38.53478162 +0000 UTC 
m=+738.147131944" lastFinishedPulling="2026-02-19 19:30:42.4584688 +0000 UTC m=+742.070819114" observedRunningTime="2026-02-19 19:30:42.822439342 +0000 UTC m=+742.434789666" watchObservedRunningTime="2026-02-19 19:30:42.822785793 +0000 UTC m=+742.435136117" Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.802543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" event={"ID":"e49e50d8-05f3-42f4-a03a-f3a750e1a134","Type":"ContainerStarted","Data":"0044e999c253f899989130bbc67c949975616f078a8a10a7e18cf186c724418e"} Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.802791 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.825638 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" podStartSLOduration=2.843743537 podStartE2EDuration="6.825613337s" podCreationTimestamp="2026-02-19 19:30:37 +0000 UTC" firstStartedPulling="2026-02-19 19:30:38.535506583 +0000 UTC m=+738.147856897" lastFinishedPulling="2026-02-19 19:30:42.517376373 +0000 UTC m=+742.129726697" observedRunningTime="2026-02-19 19:30:43.823617194 +0000 UTC m=+743.435967518" watchObservedRunningTime="2026-02-19 19:30:43.825613337 +0000 UTC m=+743.437963671" Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.851150 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" podStartSLOduration=2.91684871 podStartE2EDuration="6.851117739s" podCreationTimestamp="2026-02-19 19:30:37 +0000 UTC" firstStartedPulling="2026-02-19 19:30:38.527463383 +0000 UTC m=+738.139813697" lastFinishedPulling="2026-02-19 19:30:42.461732402 +0000 UTC m=+742.074082726" observedRunningTime="2026-02-19 19:30:43.841038376 +0000 UTC m=+743.453388710" watchObservedRunningTime="2026-02-19 
19:30:43.851117739 +0000 UTC m=+743.463468103" Feb 19 19:30:48 crc kubenswrapper[4722]: I0219 19:30:48.225777 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:55 crc kubenswrapper[4722]: I0219 19:30:55.721605 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.280033 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.282098 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.282232 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.437546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.437598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.437621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539793 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.540276 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.565979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.598739 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.799694 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.800058 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.813333 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:11 crc kubenswrapper[4722]: W0219 19:31:11.846587 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac91b740_cc99_49ce_bda9_e209dfa22140.slice/crio-e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb WatchSource:0}: Error finding container e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb: Status 404 returned error can't find the container with id e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.989390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerStarted","Data":"e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb"} Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.702220 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"] Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.703441 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.706173 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.719816 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"] Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.758281 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.758335 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.758392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: 
I0219 19:31:12.859165 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.878785 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.996993 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerID="01d1ca82849ceb3bfa6f7f0cd551b947f9fd07718242afa579460d54fe5bc317" exitCode=0 Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.997039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"01d1ca82849ceb3bfa6f7f0cd551b947f9fd07718242afa579460d54fe5bc317"} Feb 19 19:31:13 crc kubenswrapper[4722]: I0219 19:31:13.015961 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:13 crc kubenswrapper[4722]: I0219 19:31:13.193135 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"] Feb 19 19:31:13 crc kubenswrapper[4722]: W0219 19:31:13.207039 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f50f1aa_154d_409a_826d_c6c4b3c75559.slice/crio-e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4 WatchSource:0}: Error finding container e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4: Status 404 returned error can't find the container with id e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4 Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.009423 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerID="eb560083463bd32e81a3eb16d0b0a9b35ef46b2e728c36e011ce0b153ac00cb5" exitCode=0 Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.009482 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"eb560083463bd32e81a3eb16d0b0a9b35ef46b2e728c36e011ce0b153ac00cb5"} Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.009543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerStarted","Data":"e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4"} Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.013323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerStarted","Data":"8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad"} Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.995790 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.997761 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.999718 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.003984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.004196 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.026864 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerID="8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad" exitCode=0 Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.026938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad"} Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.191635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: 
I0219 19:31:15.191679 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdzk\" (UniqueName: \"kubernetes.io/projected/7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3-kube-api-access-2qdzk\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.293078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.293124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdzk\" (UniqueName: \"kubernetes.io/projected/7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3-kube-api-access-2qdzk\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.296513 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.296550 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2c18b383742e5c06cff472d15916597dce353e89b4c5777862b9dc0b774bf042/globalmount\"" pod="minio-dev/minio"
Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.316227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdzk\" (UniqueName: \"kubernetes.io/projected/7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3-kube-api-access-2qdzk\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio"
Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.317198 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio"
Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.612688 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.897686 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Feb 19 19:31:15 crc kubenswrapper[4722]: W0219 19:31:15.906085 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f22aa66_46c7_4d3c_8a69_4d67e2dcaec3.slice/crio-2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111 WatchSource:0}: Error finding container 2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111: Status 404 returned error can't find the container with id 2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111
Feb 19 19:31:16 crc kubenswrapper[4722]: I0219 19:31:16.032500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3","Type":"ContainerStarted","Data":"2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111"}
Feb 19 19:31:17 crc kubenswrapper[4722]: I0219 19:31:17.043959 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerID="33a6f974d138e43146af7cf2af1e131660c57e0192bce7040ef9d5b0386220b3" exitCode=0
Feb 19 19:31:17 crc kubenswrapper[4722]: I0219 19:31:17.044238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"33a6f974d138e43146af7cf2af1e131660c57e0192bce7040ef9d5b0386220b3"}
Feb 19 19:31:17 crc kubenswrapper[4722]: I0219 19:31:17.046975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerStarted","Data":"bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5"}
Feb 19 19:31:17 crc kubenswrapper[4722]: I0219 19:31:17.080808 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jss6p" podStartSLOduration=3.161405227 podStartE2EDuration="6.080794147s" podCreationTimestamp="2026-02-19 19:31:11 +0000 UTC" firstStartedPulling="2026-02-19 19:31:12.998533044 +0000 UTC m=+772.610883368" lastFinishedPulling="2026-02-19 19:31:15.917921954 +0000 UTC m=+775.530272288" observedRunningTime="2026-02-19 19:31:17.078708102 +0000 UTC m=+776.691058426" watchObservedRunningTime="2026-02-19 19:31:17.080794147 +0000 UTC m=+776.693144471"
Feb 19 19:31:18 crc kubenswrapper[4722]: I0219 19:31:18.054046 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerID="7bd1b3484ee9fb3dff98e5657392c34e57d5cc3cb981bb43d8b31f6f90ed8f2d" exitCode=0
Feb 19 19:31:18 crc kubenswrapper[4722]: I0219 19:31:18.054390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"7bd1b3484ee9fb3dff98e5657392c34e57d5cc3cb981bb43d8b31f6f90ed8f2d"}
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.060849 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3","Type":"ContainerStarted","Data":"10b59a7d3bcea6edd529a61d6ddb1239519bf3cbfbf06ffef4e16163a5132d83"}
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.083197 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.124218441 podStartE2EDuration="7.083175553s" podCreationTimestamp="2026-02-19 19:31:12 +0000 UTC" firstStartedPulling="2026-02-19 19:31:15.917370237 +0000 UTC m=+775.529720571" lastFinishedPulling="2026-02-19 19:31:18.876327359 +0000 UTC m=+778.488677683" observedRunningTime="2026-02-19 19:31:19.075242226 +0000 UTC m=+778.687592550" watchObservedRunningTime="2026-02-19 19:31:19.083175553 +0000 UTC m=+778.695525877"
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.316789 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.449811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"4f50f1aa-154d-409a-826d-c6c4b3c75559\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") "
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.450066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"4f50f1aa-154d-409a-826d-c6c4b3c75559\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") "
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.450203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"4f50f1aa-154d-409a-826d-c6c4b3c75559\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") "
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.450975 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle" (OuterVolumeSpecName: "bundle") pod "4f50f1aa-154d-409a-826d-c6c4b3c75559" (UID: "4f50f1aa-154d-409a-826d-c6c4b3c75559"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.459996 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util" (OuterVolumeSpecName: "util") pod "4f50f1aa-154d-409a-826d-c6c4b3c75559" (UID: "4f50f1aa-154d-409a-826d-c6c4b3c75559"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.460463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz" (OuterVolumeSpecName: "kube-api-access-jb2gz") pod "4f50f1aa-154d-409a-826d-c6c4b3c75559" (UID: "4f50f1aa-154d-409a-826d-c6c4b3c75559"). InnerVolumeSpecName "kube-api-access-jb2gz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.551868 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.551905 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") on node \"crc\" DevicePath \"\""
Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.551917 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") on node \"crc\" DevicePath \"\""
Feb 19 19:31:20 crc kubenswrapper[4722]: I0219 19:31:20.068725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4"}
Feb 19 19:31:20 crc kubenswrapper[4722]: I0219 19:31:20.069103 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4"
Feb 19 19:31:20 crc kubenswrapper[4722]: I0219 19:31:20.068757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"
Feb 19 19:31:21 crc kubenswrapper[4722]: I0219 19:31:21.599763 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jss6p"
Feb 19 19:31:21 crc kubenswrapper[4722]: I0219 19:31:21.599805 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jss6p"
Feb 19 19:31:22 crc kubenswrapper[4722]: I0219 19:31:22.658544 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jss6p" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:31:22 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:31:22 crc kubenswrapper[4722]: >
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.881985 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"]
Feb 19 19:31:25 crc kubenswrapper[4722]: E0219 19:31:25.882531 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="pull"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882546 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="pull"
Feb 19 19:31:25 crc kubenswrapper[4722]: E0219 19:31:25.882568 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="util"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882575 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="util"
Feb 19 19:31:25 crc kubenswrapper[4722]: E0219 19:31:25.882587 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="extract"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882598 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="extract"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882707 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="extract"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.884348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.891825 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.891825 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.891907 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.892138 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.902504 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-zpvtt"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.906228 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics"
Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.914580 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"]
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-manager-config\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-webhook-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-apiservice-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgsb\" (UniqueName: \"kubernetes.io/projected/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-kube-api-access-psgsb\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-manager-config\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-webhook-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-apiservice-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgsb\" (UniqueName: \"kubernetes.io/projected/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-kube-api-access-psgsb\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.150977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-manager-config\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.155784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-apiservice-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.157420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.157666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-webhook-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.168115 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgsb\" (UniqueName: \"kubernetes.io/projected/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-kube-api-access-psgsb\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.216418 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.640277 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"]
Feb 19 19:31:27 crc kubenswrapper[4722]: I0219 19:31:27.108806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" event={"ID":"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda","Type":"ContainerStarted","Data":"dfcdd4446bf858ddbcbafc5ef4e299589c41e0a6c5fe10920a0d133701b9d861"}
Feb 19 19:31:31 crc kubenswrapper[4722]: I0219 19:31:31.651830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jss6p"
Feb 19 19:31:31 crc kubenswrapper[4722]: I0219 19:31:31.716896 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jss6p"
Feb 19 19:31:32 crc kubenswrapper[4722]: I0219 19:31:32.138464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" event={"ID":"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda","Type":"ContainerStarted","Data":"defeaf615a303d690e3a8a05bc148bd973ec1e75c2e76fe14f14c0734b426ee8"}
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.041488 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"]
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.042767 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jss6p" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server" containerID="cri-o://bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5" gracePeriod=2
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.219809 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerID="bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5" exitCode=0
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.219852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5"}
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.493080 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p"
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.689803 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"ac91b740-cc99-49ce-bda9-e209dfa22140\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") "
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.690980 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"ac91b740-cc99-49ce-bda9-e209dfa22140\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") "
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.691088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"ac91b740-cc99-49ce-bda9-e209dfa22140\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") "
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.695913 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh" (OuterVolumeSpecName: "kube-api-access-5mlmh") pod "ac91b740-cc99-49ce-bda9-e209dfa22140" (UID: "ac91b740-cc99-49ce-bda9-e209dfa22140"). InnerVolumeSpecName "kube-api-access-5mlmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.701929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities" (OuterVolumeSpecName: "utilities") pod "ac91b740-cc99-49ce-bda9-e209dfa22140" (UID: "ac91b740-cc99-49ce-bda9-e209dfa22140"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.791806 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") on node \"crc\" DevicePath \"\""
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.792377 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.809842 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac91b740-cc99-49ce-bda9-e209dfa22140" (UID: "ac91b740-cc99-49ce-bda9-e209dfa22140"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.896282 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.230745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb"}
Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.230803 4722 scope.go:117] "RemoveContainer" containerID="bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5"
Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.230838 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p"
Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.250663 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"]
Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.257397 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"]
Feb 19 19:31:37 crc kubenswrapper[4722]: I0219 19:31:37.081823 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" path="/var/lib/kubelet/pods/ac91b740-cc99-49ce-bda9-e209dfa22140/volumes"
Feb 19 19:31:37 crc kubenswrapper[4722]: I0219 19:31:37.276293 4722 scope.go:117] "RemoveContainer" containerID="8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad"
Feb 19 19:31:38 crc kubenswrapper[4722]: I0219 19:31:38.073772 4722 scope.go:117] "RemoveContainer" containerID="01d1ca82849ceb3bfa6f7f0cd551b947f9fd07718242afa579460d54fe5bc317"
Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.255947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" event={"ID":"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda","Type":"ContainerStarted","Data":"50cf33f9ef0af83ea86e54b9f79085493210ffbf3ad29f62bb735aa9b1c53483"}
Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.256470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.262224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"
Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.290206 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" podStartSLOduration=2.783032519 podStartE2EDuration="14.290184538s" podCreationTimestamp="2026-02-19 19:31:25 +0000 UTC" firstStartedPulling="2026-02-19 19:31:26.64438746 +0000 UTC m=+786.256737794" lastFinishedPulling="2026-02-19 19:31:38.151539489 +0000 UTC m=+797.763889813" observedRunningTime="2026-02-19 19:31:39.289009162 +0000 UTC m=+798.901359496" watchObservedRunningTime="2026-02-19 19:31:39.290184538 +0000 UTC m=+798.902534902"
Feb 19 19:31:41 crc kubenswrapper[4722]: I0219 19:31:41.798065 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:31:41 crc kubenswrapper[4722]: I0219 19:31:41.798119 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.967772 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"]
Feb 19 19:32:10 crc kubenswrapper[4722]: E0219 19:32:10.969304 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-utilities"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969381 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-utilities"
Feb 19 19:32:10 crc kubenswrapper[4722]: E0219 19:32:10.969452 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-content"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969511 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-content"
Feb 19 19:32:10 crc kubenswrapper[4722]: E0219 19:32:10.969563 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969612 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969755 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.970550 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.972690 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.979415 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"]
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.980274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.980388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.980433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.082431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.082784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.082847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.083393 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.083831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.105864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.288992 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798078 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798472 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798514 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798920 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798975 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84" gracePeriod=600
Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.868097 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"]
Feb 19 19:32:11 crc kubenswrapper[4722]: W0219 19:32:11.876565 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e5779bd_c885_4bc1_8f8d_924b571e2851.slice/crio-901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671 WatchSource:0}: Error finding container 901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671: Status 404 returned error can't find the container with id 901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671
Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.492981 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerID="9d50fcdd98a825fa5ed4491ac4173ffc494f21622f9c50f1386e71c38a51761f" exitCode=0
Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.493075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"9d50fcdd98a825fa5ed4491ac4173ffc494f21622f9c50f1386e71c38a51761f"}
Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.493381 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerStarted","Data":"901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671"}
Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.495165 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84" exitCode=0
Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.495177 4722 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84"} Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.495215 4722 scope.go:117] "RemoveContainer" containerID="ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c" Feb 19 19:32:13 crc kubenswrapper[4722]: I0219 19:32:13.501362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1"} Feb 19 19:32:14 crc kubenswrapper[4722]: I0219 19:32:14.511037 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerID="07d83f2a8566df6f2cb63f1cee655e08614f8aab331873f14d7ed91c61dc7276" exitCode=0 Feb 19 19:32:14 crc kubenswrapper[4722]: I0219 19:32:14.511190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"07d83f2a8566df6f2cb63f1cee655e08614f8aab331873f14d7ed91c61dc7276"} Feb 19 19:32:15 crc kubenswrapper[4722]: I0219 19:32:15.522143 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerID="0a1b4d448647d75e417bb8f0254c921c4479419fa3511312e2dc3f1bd4121724" exitCode=0 Feb 19 19:32:15 crc kubenswrapper[4722]: I0219 19:32:15.522233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"0a1b4d448647d75e417bb8f0254c921c4479419fa3511312e2dc3f1bd4121724"} Feb 19 19:32:16 crc 
kubenswrapper[4722]: I0219 19:32:16.833464 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.974423 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"9e5779bd-c885-4bc1-8f8d-924b571e2851\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.974517 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"9e5779bd-c885-4bc1-8f8d-924b571e2851\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.974601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"9e5779bd-c885-4bc1-8f8d-924b571e2851\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.975103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle" (OuterVolumeSpecName: "bundle") pod "9e5779bd-c885-4bc1-8f8d-924b571e2851" (UID: "9e5779bd-c885-4bc1-8f8d-924b571e2851"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.978926 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv" (OuterVolumeSpecName: "kube-api-access-dxpmv") pod "9e5779bd-c885-4bc1-8f8d-924b571e2851" (UID: "9e5779bd-c885-4bc1-8f8d-924b571e2851"). InnerVolumeSpecName "kube-api-access-dxpmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.994412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util" (OuterVolumeSpecName: "util") pod "9e5779bd-c885-4bc1-8f8d-924b571e2851" (UID: "9e5779bd-c885-4bc1-8f8d-924b571e2851"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.075388 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.075418 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.075429 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") on node \"crc\" DevicePath \"\"" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.538802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" 
event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671"} Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.538847 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.538850 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.779617 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-hclph"] Feb 19 19:32:20 crc kubenswrapper[4722]: E0219 19:32:20.780126 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="pull" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780139 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="pull" Feb 19 19:32:20 crc kubenswrapper[4722]: E0219 19:32:20.780187 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="util" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780194 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="util" Feb 19 19:32:20 crc kubenswrapper[4722]: E0219 19:32:20.780205 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="extract" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780212 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="extract" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780301 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="extract" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.783608 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.783843 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.784189 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jbjc5" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.796459 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-hclph"] Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.925638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjg8\" (UniqueName: \"kubernetes.io/projected/296e010f-202c-4c01-836e-be6c48607e5f-kube-api-access-ttjg8\") pod \"nmstate-operator-694c9596b7-hclph\" (UID: \"296e010f-202c-4c01-836e-be6c48607e5f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.027237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjg8\" (UniqueName: \"kubernetes.io/projected/296e010f-202c-4c01-836e-be6c48607e5f-kube-api-access-ttjg8\") pod \"nmstate-operator-694c9596b7-hclph\" (UID: \"296e010f-202c-4c01-836e-be6c48607e5f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.046434 4722 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.056430 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.073928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjg8\" (UniqueName: \"kubernetes.io/projected/296e010f-202c-4c01-836e-be6c48607e5f-kube-api-access-ttjg8\") pod \"nmstate-operator-694c9596b7-hclph\" (UID: \"296e010f-202c-4c01-836e-be6c48607e5f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.105508 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jbjc5" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.114783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.371805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-hclph"] Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.565900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" event={"ID":"296e010f-202c-4c01-836e-be6c48607e5f","Type":"ContainerStarted","Data":"d5243cccd00de87c4124189f9a339f018587a00e0d850dfad89108f7a6b4ecf3"} Feb 19 19:32:24 crc kubenswrapper[4722]: I0219 19:32:24.587191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" event={"ID":"296e010f-202c-4c01-836e-be6c48607e5f","Type":"ContainerStarted","Data":"7bfc2b5f20d83c9f10af90bd43fbf2a133f2fb3f71994ad7bee77b6c7296bb51"} Feb 19 19:32:24 crc kubenswrapper[4722]: I0219 19:32:24.618002 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" podStartSLOduration=2.557205976 podStartE2EDuration="4.617976198s" podCreationTimestamp="2026-02-19 19:32:20 +0000 UTC" firstStartedPulling="2026-02-19 19:32:21.382650739 +0000 UTC m=+840.995001063" lastFinishedPulling="2026-02-19 19:32:23.443420951 +0000 UTC m=+843.055771285" observedRunningTime="2026-02-19 19:32:24.610566776 +0000 UTC m=+844.222917140" watchObservedRunningTime="2026-02-19 19:32:24.617976198 +0000 UTC m=+844.230326552" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.656279 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.657407 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.659737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wld46" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.662290 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.662965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.667259 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.671093 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-tvslw"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.673487 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.680002 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.701676 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.782911 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.783613 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.786426 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.786716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-lkr6n" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787664 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-ovs-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-dbus-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 
19:32:25.787748 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjd8p\" (UniqueName: \"kubernetes.io/projected/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-kube-api-access-gjd8p\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-nmstate-lock\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787804 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2zlp\" (UniqueName: \"kubernetes.io/projected/f9185385-162a-40a7-9563-3c668080b9e9-kube-api-access-h2zlp\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6nsl\" (UniqueName: \"kubernetes.io/projected/62ed738c-2401-4b21-b6a8-1bc2c1c009ae-kube-api-access-b6nsl\") pod \"nmstate-metrics-58c85c668d-t5lsr\" (UID: \"62ed738c-2401-4b21-b6a8-1bc2c1c009ae\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9185385-162a-40a7-9563-3c668080b9e9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: 
\"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787998 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.811460 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-nmstate-lock\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889730 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2zlp\" (UniqueName: \"kubernetes.io/projected/f9185385-162a-40a7-9563-3c668080b9e9-kube-api-access-h2zlp\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6nsl\" (UniqueName: \"kubernetes.io/projected/62ed738c-2401-4b21-b6a8-1bc2c1c009ae-kube-api-access-b6nsl\") pod 
\"nmstate-metrics-58c85c668d-t5lsr\" (UID: \"62ed738c-2401-4b21-b6a8-1bc2c1c009ae\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4g6\" (UniqueName: \"kubernetes.io/projected/ed131fa7-525a-481d-83a9-4fef817dc7ce-kube-api-access-xc4g6\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9185385-162a-40a7-9563-3c668080b9e9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-ovs-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889885 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-dbus-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/ed131fa7-525a-481d-83a9-4fef817dc7ce-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjd8p\" (UniqueName: \"kubernetes.io/projected/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-kube-api-access-gjd8p\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.890351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-nmstate-lock\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.891433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-dbus-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.891870 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-ovs-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.896122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9185385-162a-40a7-9563-3c668080b9e9-tls-key-pair\") pod 
\"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.906393 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2zlp\" (UniqueName: \"kubernetes.io/projected/f9185385-162a-40a7-9563-3c668080b9e9-kube-api-access-h2zlp\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.906609 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjd8p\" (UniqueName: \"kubernetes.io/projected/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-kube-api-access-gjd8p\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.909175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6nsl\" (UniqueName: \"kubernetes.io/projected/62ed738c-2401-4b21-b6a8-1bc2c1c009ae-kube-api-access-b6nsl\") pod \"nmstate-metrics-58c85c668d-t5lsr\" (UID: \"62ed738c-2401-4b21-b6a8-1bc2c1c009ae\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.986576 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.989317 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fb979c56d-ddvr6"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990200 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed131fa7-525a-481d-83a9-4fef817dc7ce-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4g6\" (UniqueName: \"kubernetes.io/projected/ed131fa7-525a-481d-83a9-4fef817dc7ce-kube-api-access-xc4g6\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: E0219 19:32:25.991063 4722 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 19 19:32:25 crc kubenswrapper[4722]: E0219 19:32:25.991181 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert podName:ed131fa7-525a-481d-83a9-4fef817dc7ce nodeName:}" failed. No retries permitted until 2026-02-19 19:32:26.491116217 +0000 UTC m=+846.103466541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-nlx9v" (UID: "ed131fa7-525a-481d-83a9-4fef817dc7ce") : secret "plugin-serving-cert" not found Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.992074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed131fa7-525a-481d-83a9-4fef817dc7ce-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.006436 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb979c56d-ddvr6"] Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.037930 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.040214 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.047829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4g6\" (UniqueName: \"kubernetes.io/projected/ed131fa7-525a-481d-83a9-4fef817dc7ce-kube-api-access-xc4g6\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.091950 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-oauth-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092342 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-service-ca\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-oauth-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-console-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092437 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-trusted-ca-bundle\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092465 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km8sf\" (UniqueName: \"kubernetes.io/projected/defd195c-f260-424a-8740-be368c4d8e64-kube-api-access-km8sf\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-oauth-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-service-ca\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194345 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-oauth-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-trusted-ca-bundle\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194511 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-console-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km8sf\" 
(UniqueName: \"kubernetes.io/projected/defd195c-f260-424a-8740-be368c4d8e64-kube-api-access-km8sf\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.197281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-service-ca\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.197429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-oauth-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.199415 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-console-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.199741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-oauth-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.199976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.200456 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-trusted-ca-bundle\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.216086 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km8sf\" (UniqueName: \"kubernetes.io/projected/defd195c-f260-424a-8740-be368c4d8e64-kube-api-access-km8sf\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: W0219 19:32:26.269759 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9185385_162a_40a7_9563_3c668080b9e9.slice/crio-7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d WatchSource:0}: Error finding container 7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d: Status 404 returned error can't find the container with id 7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.270079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv"] Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.341693 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.415091 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr"] Feb 19 19:32:26 crc kubenswrapper[4722]: W0219 19:32:26.425402 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ed738c_2401_4b21_b6a8_1bc2c1c009ae.slice/crio-accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28 WatchSource:0}: Error finding container accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28: Status 404 returned error can't find the container with id accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28 Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.498402 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.504839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.580301 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb979c56d-ddvr6"] Feb 19 19:32:26 crc kubenswrapper[4722]: W0219 19:32:26.585281 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefd195c_f260_424a_8740_be368c4d8e64.slice/crio-ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974 WatchSource:0}: Error finding container ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974: Status 404 returned error can't find the container with id ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974 Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.597688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" event={"ID":"f9185385-162a-40a7-9563-3c668080b9e9","Type":"ContainerStarted","Data":"7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.599536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" event={"ID":"62ed738c-2401-4b21-b6a8-1bc2c1c009ae","Type":"ContainerStarted","Data":"accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.600476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb979c56d-ddvr6" event={"ID":"defd195c-f260-424a-8740-be368c4d8e64","Type":"ContainerStarted","Data":"ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.601673 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tvslw" event={"ID":"59139bb2-e1ae-4f74-96fe-6ea34d232cd9","Type":"ContainerStarted","Data":"eb68fa767886eb1ca3412bb8d8c2dcb200b27b72c65526020742dad4f4733791"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.708515 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:27 crc kubenswrapper[4722]: I0219 19:32:27.166978 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v"] Feb 19 19:32:27 crc kubenswrapper[4722]: W0219 19:32:27.174893 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded131fa7_525a_481d_83a9_4fef817dc7ce.slice/crio-251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7 WatchSource:0}: Error finding container 251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7: Status 404 returned error can't find the container with id 251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7 Feb 19 19:32:27 crc kubenswrapper[4722]: I0219 19:32:27.608212 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" event={"ID":"ed131fa7-525a-481d-83a9-4fef817dc7ce","Type":"ContainerStarted","Data":"251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7"} Feb 19 19:32:27 crc kubenswrapper[4722]: I0219 19:32:27.610036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb979c56d-ddvr6" event={"ID":"defd195c-f260-424a-8740-be368c4d8e64","Type":"ContainerStarted","Data":"17d85fd0f3f57c1dfcdfebd6b1039b0749f82847f1f7ca46cf88cb343f6fc399"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.623249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" event={"ID":"f9185385-162a-40a7-9563-3c668080b9e9","Type":"ContainerStarted","Data":"b0f581c2a48df492c939cb59b8a40f93b0c5768dd79cbf576b2b46af03eaf9d7"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.624681 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:29 crc 
kubenswrapper[4722]: I0219 19:32:29.626498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" event={"ID":"62ed738c-2401-4b21-b6a8-1bc2c1c009ae","Type":"ContainerStarted","Data":"40b54ede01293a2bee2ff3af26cb470b08b4fa287df75102887c47ce592bc0c7"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.628120 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tvslw" event={"ID":"59139bb2-e1ae-4f74-96fe-6ea34d232cd9","Type":"ContainerStarted","Data":"cbd9cab1a56cecca803ffe0d28cfd983624c3d43e32dce7df5eb02641fc9a290"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.628585 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.644145 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fb979c56d-ddvr6" podStartSLOduration=4.644126737 podStartE2EDuration="4.644126737s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:32:27.641885135 +0000 UTC m=+847.254235479" watchObservedRunningTime="2026-02-19 19:32:29.644126737 +0000 UTC m=+849.256477051" Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.649649 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" podStartSLOduration=2.240042429 podStartE2EDuration="4.649631219s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:26.271927301 +0000 UTC m=+845.884277625" lastFinishedPulling="2026-02-19 19:32:28.681516091 +0000 UTC m=+848.293866415" observedRunningTime="2026-02-19 19:32:29.639734951 +0000 UTC m=+849.252085285" watchObservedRunningTime="2026-02-19 19:32:29.649631219 +0000 UTC 
m=+849.261981543" Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.664614 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-tvslw" podStartSLOduration=2.08516175 podStartE2EDuration="4.664516333s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:26.100736984 +0000 UTC m=+845.713087298" lastFinishedPulling="2026-02-19 19:32:28.680091557 +0000 UTC m=+848.292441881" observedRunningTime="2026-02-19 19:32:29.657514275 +0000 UTC m=+849.269864599" watchObservedRunningTime="2026-02-19 19:32:29.664516333 +0000 UTC m=+849.276866657" Feb 19 19:32:30 crc kubenswrapper[4722]: I0219 19:32:30.640619 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" event={"ID":"ed131fa7-525a-481d-83a9-4fef817dc7ce","Type":"ContainerStarted","Data":"905b96133533a1f6cf98023a1c82527ee542932bd35c2639d5ebc67e4f1d586b"} Feb 19 19:32:30 crc kubenswrapper[4722]: I0219 19:32:30.665483 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" podStartSLOduration=3.0373082 podStartE2EDuration="5.665457054s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:27.178009965 +0000 UTC m=+846.790360289" lastFinishedPulling="2026-02-19 19:32:29.806158819 +0000 UTC m=+849.418509143" observedRunningTime="2026-02-19 19:32:30.662177572 +0000 UTC m=+850.274527906" watchObservedRunningTime="2026-02-19 19:32:30.665457054 +0000 UTC m=+850.277807398" Feb 19 19:32:32 crc kubenswrapper[4722]: I0219 19:32:32.655660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" event={"ID":"62ed738c-2401-4b21-b6a8-1bc2c1c009ae","Type":"ContainerStarted","Data":"1eb4b5b440eaadd46c6dc6572c1b7199002aa8c187d6866815521a60b4fe01c9"} Feb 19 19:32:32 crc kubenswrapper[4722]: I0219 
19:32:32.697003 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" podStartSLOduration=2.338724354 podStartE2EDuration="7.68351872s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:26.428106619 +0000 UTC m=+846.040456993" lastFinishedPulling="2026-02-19 19:32:31.772901025 +0000 UTC m=+851.385251359" observedRunningTime="2026-02-19 19:32:32.67999752 +0000 UTC m=+852.292347914" watchObservedRunningTime="2026-02-19 19:32:32.68351872 +0000 UTC m=+852.295869074" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.068642 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.342540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.342621 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.347892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.690659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.770844 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:32:46 crc kubenswrapper[4722]: I0219 19:32:46.048960 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.843679 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.848585 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.873225 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.986659 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.986719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.986797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.087953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"community-operators-5t95p\" (UID: 
\"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.088320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.088371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.089168 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.089220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.112055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " 
pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.169631 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.686878 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.849857 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e454faa-cee8-4571-88ed-88bb048abe32" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" exitCode=0 Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.849908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2"} Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.849958 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerStarted","Data":"139202899fc571b300add407645fb64230d610b9266e7028671d6d1cc0159fda"} Feb 19 19:32:59 crc kubenswrapper[4722]: I0219 19:32:59.857258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerStarted","Data":"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191"} Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.252432 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb"] Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.253682 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.255598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.260909 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb"] Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.427516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.427597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.427668 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: 
I0219 19:33:00.528294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528368 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.547347 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.569856 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.865376 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e454faa-cee8-4571-88ed-88bb048abe32" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" exitCode=0 Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.865439 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.008687 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb"] Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.826518 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-txlzt" 
podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" containerID="cri-o://25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" gracePeriod=15 Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.873576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerStarted","Data":"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.876011 4722 generic.go:334] "Generic (PLEG): container finished" podID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerID="33c31fb6cece00c6ce8a25f27e4fbb1f073a2b8beaacb0aa68fbf04528f42ba4" exitCode=0 Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.876053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"33c31fb6cece00c6ce8a25f27e4fbb1f073a2b8beaacb0aa68fbf04528f42ba4"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.876078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerStarted","Data":"e509223e66336b2f8aea7b9aeca56637052efbf0782b27bea70243047cae786c"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.891900 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5t95p" podStartSLOduration=2.439929147 podStartE2EDuration="4.89188119s" podCreationTimestamp="2026-02-19 19:32:57 +0000 UTC" firstStartedPulling="2026-02-19 19:32:58.852032992 +0000 UTC m=+878.464383316" lastFinishedPulling="2026-02-19 19:33:01.303985035 +0000 UTC m=+880.916335359" observedRunningTime="2026-02-19 19:33:01.890465646 +0000 
UTC m=+881.502816000" watchObservedRunningTime="2026-02-19 19:33:01.89188119 +0000 UTC m=+881.504231524" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.228819 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-txlzt_187676b8-1029-4153-9da5-6614e9b7892e/console/0.log" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.229110 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352649 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 
19:33:02.352844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca" (OuterVolumeSpecName: "service-ca") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353735 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353748 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config" (OuterVolumeSpecName: "console-config") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.358678 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.358898 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9" (OuterVolumeSpecName: "kube-api-access-hqdj9") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "kube-api-access-hqdj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.358974 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454183 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454221 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454232 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454242 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454252 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454261 4722 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454273 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882204 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-txlzt_187676b8-1029-4153-9da5-6614e9b7892e/console/0.log" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882246 4722 generic.go:334] "Generic (PLEG): container finished" podID="187676b8-1029-4153-9da5-6614e9b7892e" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" exitCode=2 Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerDied","Data":"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af"} Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882329 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882375 4722 scope.go:117] "RemoveContainer" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882364 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerDied","Data":"fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3"} Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.907366 4722 scope.go:117] "RemoveContainer" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" Feb 19 19:33:02 crc kubenswrapper[4722]: E0219 19:33:02.907930 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af\": container with ID starting with 25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af not found: ID does not exist" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.907999 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af"} err="failed to get container status \"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af\": rpc error: code = NotFound desc = could not find container \"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af\": container with ID starting with 25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af not found: ID does not exist" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.924795 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:33:02 crc 
kubenswrapper[4722]: I0219 19:33:02.936241 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:33:03 crc kubenswrapper[4722]: I0219 19:33:03.081100 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187676b8-1029-4153-9da5-6614e9b7892e" path="/var/lib/kubelet/pods/187676b8-1029-4153-9da5-6614e9b7892e/volumes" Feb 19 19:33:03 crc kubenswrapper[4722]: I0219 19:33:03.895048 4722 generic.go:334] "Generic (PLEG): container finished" podID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerID="35b389162c760bf1c3e490310072541d2133d47a3ddc8b348105d9db972ad459" exitCode=0 Feb 19 19:33:03 crc kubenswrapper[4722]: I0219 19:33:03.895243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"35b389162c760bf1c3e490310072541d2133d47a3ddc8b348105d9db972ad459"} Feb 19 19:33:04 crc kubenswrapper[4722]: I0219 19:33:04.911410 4722 generic.go:334] "Generic (PLEG): container finished" podID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerID="fbba620dd77635b9ebc6fb1d20fb41377dd7eda8493d11165e9dd8721dd04bc9" exitCode=0 Feb 19 19:33:04 crc kubenswrapper[4722]: I0219 19:33:04.911484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"fbba620dd77635b9ebc6fb1d20fb41377dd7eda8493d11165e9dd8721dd04bc9"} Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.188269 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.301361 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.301403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.301465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.302636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle" (OuterVolumeSpecName: "bundle") pod "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" (UID: "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.309307 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k" (OuterVolumeSpecName: "kube-api-access-pj62k") pod "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" (UID: "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72"). InnerVolumeSpecName "kube-api-access-pj62k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.402349 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.402384 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.657454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util" (OuterVolumeSpecName: "util") pod "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" (UID: "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.706346 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.929471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"e509223e66336b2f8aea7b9aeca56637052efbf0782b27bea70243047cae786c"} Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.929538 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.929542 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e509223e66336b2f8aea7b9aeca56637052efbf0782b27bea70243047cae786c" Feb 19 19:33:08 crc kubenswrapper[4722]: I0219 19:33:08.170920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:08 crc kubenswrapper[4722]: I0219 19:33:08.170972 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:08 crc kubenswrapper[4722]: I0219 19:33:08.213054 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:09 crc kubenswrapper[4722]: I0219 19:33:09.019073 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:10 crc kubenswrapper[4722]: I0219 19:33:10.217110 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:33:10 crc kubenswrapper[4722]: I0219 19:33:10.955061 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5t95p" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" containerID="cri-o://78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" gracePeriod=2 Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.339118 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.378754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"9e454faa-cee8-4571-88ed-88bb048abe32\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.378962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"9e454faa-cee8-4571-88ed-88bb048abe32\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.383422 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq" (OuterVolumeSpecName: "kube-api-access-c65pq") pod "9e454faa-cee8-4571-88ed-88bb048abe32" (UID: "9e454faa-cee8-4571-88ed-88bb048abe32"). InnerVolumeSpecName "kube-api-access-c65pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.390021 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities" (OuterVolumeSpecName: "utilities") pod "9e454faa-cee8-4571-88ed-88bb048abe32" (UID: "9e454faa-cee8-4571-88ed-88bb048abe32"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.480428 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"9e454faa-cee8-4571-88ed-88bb048abe32\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.480868 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.480901 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.530239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e454faa-cee8-4571-88ed-88bb048abe32" (UID: "9e454faa-cee8-4571-88ed-88bb048abe32"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.582187 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963420 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e454faa-cee8-4571-88ed-88bb048abe32" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" exitCode=0 Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963478 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b"} Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"139202899fc571b300add407645fb64230d610b9266e7028671d6d1cc0159fda"} Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963541 4722 scope.go:117] "RemoveContainer" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963705 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.005825 4722 scope.go:117] "RemoveContainer" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.019111 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.024892 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.029931 4722 scope.go:117] "RemoveContainer" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.052130 4722 scope.go:117] "RemoveContainer" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" Feb 19 19:33:12 crc kubenswrapper[4722]: E0219 19:33:12.052602 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b\": container with ID starting with 78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b not found: ID does not exist" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.052642 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b"} err="failed to get container status \"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b\": rpc error: code = NotFound desc = could not find container \"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b\": container with ID starting with 78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b not 
found: ID does not exist" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.052665 4722 scope.go:117] "RemoveContainer" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" Feb 19 19:33:12 crc kubenswrapper[4722]: E0219 19:33:12.052982 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191\": container with ID starting with a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191 not found: ID does not exist" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.053002 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191"} err="failed to get container status \"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191\": rpc error: code = NotFound desc = could not find container \"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191\": container with ID starting with a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191 not found: ID does not exist" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.053015 4722 scope.go:117] "RemoveContainer" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" Feb 19 19:33:12 crc kubenswrapper[4722]: E0219 19:33:12.053309 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2\": container with ID starting with c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2 not found: ID does not exist" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.053328 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2"} err="failed to get container status \"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2\": rpc error: code = NotFound desc = could not find container \"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2\": container with ID starting with c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2 not found: ID does not exist" Feb 19 19:33:13 crc kubenswrapper[4722]: I0219 19:33:13.077904 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" path="/var/lib/kubelet/pods/9e454faa-cee8-4571-88ed-88bb048abe32/volumes" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386570 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx"] Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386855 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-content" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386873 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-content" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386891 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="pull" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386899 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="pull" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386918 
4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386934 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386943 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386961 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-utilities" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386970 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-utilities" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386984 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="util" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386991 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="util" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.387002 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="extract" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387010 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="extract" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387130 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="extract" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387145 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387182 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.390104 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.390757 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.392791 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ntbjt" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.393315 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.399983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.433421 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx"] Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.518925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-webhook-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " 
pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.518996 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr4kk\" (UniqueName: \"kubernetes.io/projected/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-kube-api-access-lr4kk\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.519087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-apiservice-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.619920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr4kk\" (UniqueName: \"kubernetes.io/projected/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-kube-api-access-lr4kk\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.620259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-apiservice-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.620384 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-webhook-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.626251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-webhook-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.626266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-apiservice-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.642921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr4kk\" (UniqueName: \"kubernetes.io/projected/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-kube-api-access-lr4kk\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.702769 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.723210 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2"] Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.724069 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.725755 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.725930 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.727141 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9rxqc" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.746833 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2"] Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.829844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s7c6\" (UniqueName: \"kubernetes.io/projected/02eda63c-5131-407e-bb2e-7ad0adf0e985-kube-api-access-4s7c6\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.829922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-webhook-cert\") pod 
\"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.829960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-apiservice-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.931078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-webhook-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.931457 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-apiservice-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.931528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s7c6\" (UniqueName: \"kubernetes.io/projected/02eda63c-5131-407e-bb2e-7ad0adf0e985-kube-api-access-4s7c6\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.936675 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-webhook-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.936686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-apiservice-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.949061 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s7c6\" (UniqueName: \"kubernetes.io/projected/02eda63c-5131-407e-bb2e-7ad0adf0e985-kube-api-access-4s7c6\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.080126 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.176783 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx"] Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.487396 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2"] Feb 19 19:33:15 crc kubenswrapper[4722]: W0219 19:33:15.492946 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02eda63c_5131_407e_bb2e_7ad0adf0e985.slice/crio-6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661 WatchSource:0}: Error finding container 6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661: Status 404 returned error can't find the container with id 6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661 Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.989952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" event={"ID":"f41ca32e-24fc-427a-a2bc-76e4d5abba0f","Type":"ContainerStarted","Data":"7321c210129aad1c086d6637f960184aa78d4127678701faa27f6affe6443088"} Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.991452 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" event={"ID":"02eda63c-5131-407e-bb2e-7ad0adf0e985","Type":"ContainerStarted","Data":"6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661"} Feb 19 19:33:20 crc kubenswrapper[4722]: I0219 19:33:20.040663 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" 
event={"ID":"f41ca32e-24fc-427a-a2bc-76e4d5abba0f","Type":"ContainerStarted","Data":"fc23e83eb336b28c2e17c393241d9a071d489ddd2e45d48654dee175777e1d11"} Feb 19 19:33:20 crc kubenswrapper[4722]: I0219 19:33:20.043063 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:20 crc kubenswrapper[4722]: I0219 19:33:20.077421 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" podStartSLOduration=2.076305992 podStartE2EDuration="6.077404282s" podCreationTimestamp="2026-02-19 19:33:14 +0000 UTC" firstStartedPulling="2026-02-19 19:33:15.197790687 +0000 UTC m=+894.810141011" lastFinishedPulling="2026-02-19 19:33:19.198888977 +0000 UTC m=+898.811239301" observedRunningTime="2026-02-19 19:33:20.075736679 +0000 UTC m=+899.688087013" watchObservedRunningTime="2026-02-19 19:33:20.077404282 +0000 UTC m=+899.689754606" Feb 19 19:33:22 crc kubenswrapper[4722]: I0219 19:33:22.056040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" event={"ID":"02eda63c-5131-407e-bb2e-7ad0adf0e985","Type":"ContainerStarted","Data":"913eab5307a521c0eddf7939b7a2e5bf07dc152e28955b9917626731e0702e53"} Feb 19 19:33:22 crc kubenswrapper[4722]: I0219 19:33:22.056401 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:22 crc kubenswrapper[4722]: I0219 19:33:22.078747 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" podStartSLOduration=2.536726962 podStartE2EDuration="8.078730076s" podCreationTimestamp="2026-02-19 19:33:14 +0000 UTC" firstStartedPulling="2026-02-19 19:33:15.496491017 +0000 UTC m=+895.108841341" lastFinishedPulling="2026-02-19 
19:33:21.038494131 +0000 UTC m=+900.650844455" observedRunningTime="2026-02-19 19:33:22.075040001 +0000 UTC m=+901.687390325" watchObservedRunningTime="2026-02-19 19:33:22.078730076 +0000 UTC m=+901.691080400" Feb 19 19:33:35 crc kubenswrapper[4722]: I0219 19:33:35.084309 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.013775 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.015213 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.028333 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.031086 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.031489 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.031514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rz6\" (UniqueName: 
\"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.131969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.150686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.329927 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.803426 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:33:49 crc kubenswrapper[4722]: W0219 19:33:49.807092 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdf0909_68d5_47d0_a7db_fb4b0badbb9e.slice/crio-da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83 WatchSource:0}: Error finding container da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83: Status 404 returned error can't find the container with id da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83 Feb 19 19:33:50 crc kubenswrapper[4722]: I0219 19:33:50.539824 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerStarted","Data":"da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83"} Feb 19 19:33:51 crc kubenswrapper[4722]: I0219 19:33:51.547627 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerID="af1a7852e6e36db3bfdab60b2f38a0b981bc6b497a8bfeafd40f05127561893a" exitCode=0 Feb 19 19:33:51 crc kubenswrapper[4722]: I0219 19:33:51.547674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"af1a7852e6e36db3bfdab60b2f38a0b981bc6b497a8bfeafd40f05127561893a"} Feb 19 19:33:52 crc kubenswrapper[4722]: I0219 19:33:52.557453 4722 generic.go:334] "Generic (PLEG): container finished" podID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerID="af96eb5f1e4c5f1724573bceb2f73b63f2d39809a28f76d211167409be3a723e" exitCode=0 Feb 19 19:33:52 crc kubenswrapper[4722]: I0219 19:33:52.557511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"af96eb5f1e4c5f1724573bceb2f73b63f2d39809a28f76d211167409be3a723e"} Feb 19 19:33:53 crc kubenswrapper[4722]: I0219 19:33:53.568075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerStarted","Data":"12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597"} Feb 19 19:33:53 crc kubenswrapper[4722]: I0219 19:33:53.591235 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8l56l" podStartSLOduration=4.171619009 podStartE2EDuration="5.59120538s" podCreationTimestamp="2026-02-19 19:33:48 +0000 UTC" firstStartedPulling="2026-02-19 19:33:51.55022259 +0000 UTC m=+931.162572934" lastFinishedPulling="2026-02-19 19:33:52.969808971 +0000 UTC m=+932.582159305" observedRunningTime="2026-02-19 19:33:53.588899209 +0000 UTC m=+933.201249543" watchObservedRunningTime="2026-02-19 19:33:53.59120538 +0000 UTC m=+933.203555754" Feb 19 19:33:54 
crc kubenswrapper[4722]: I0219 19:33:54.704796 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.484048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.485439 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.487087 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-krb58" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.488071 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.500690 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-92pkj"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.504013 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.507042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.507069 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.509742 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.599330 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-nnmrq"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.601333 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.603575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pr7mk" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.604268 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.604357 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.604901 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.613859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt27g\" (UniqueName: \"kubernetes.io/projected/505e06e7-65a2-4444-8552-8b96253c87fc-kube-api-access-pt27g\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.613949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-sockets\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.613979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bg8l\" (UniqueName: \"kubernetes.io/projected/4f25f2fe-8438-431d-9e9d-9efba0109efd-kube-api-access-5bg8l\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614047 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-startup\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics-certs\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: 
I0219 19:33:55.614085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-conf\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/505e06e7-65a2-4444-8552-8b96253c87fc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614165 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-reloader\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614306 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-h9kn7"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.621437 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.626108 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.634238 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-h9kn7"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/505e06e7-65a2-4444-8552-8b96253c87fc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-reloader\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt27g\" (UniqueName: \"kubernetes.io/projected/505e06e7-65a2-4444-8552-8b96253c87fc-kube-api-access-pt27g\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc 
kubenswrapper[4722]: I0219 19:33:55.717339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-sockets\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717365 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bg8l\" (UniqueName: \"kubernetes.io/projected/4f25f2fe-8438-431d-9e9d-9efba0109efd-kube-api-access-5bg8l\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717423 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1319426-40ee-40fc-86bf-64cca26d6860-metallb-excludel2\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5r5\" (UniqueName: \"kubernetes.io/projected/d1319426-40ee-40fc-86bf-64cca26d6860-kube-api-access-9f5r5\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-startup\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics-certs\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-conf\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-metrics-certs\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.718029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-reloader\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.718688 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-sockets\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" 
Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.719113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-startup\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.719368 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.721304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-conf\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.724616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics-certs\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.729568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/505e06e7-65a2-4444-8552-8b96253c87fc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.735825 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bg8l\" (UniqueName: 
\"kubernetes.io/projected/4f25f2fe-8438-431d-9e9d-9efba0109efd-kube-api-access-5bg8l\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.745848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt27g\" (UniqueName: \"kubernetes.io/projected/505e06e7-65a2-4444-8552-8b96253c87fc-kube-api-access-pt27g\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.801833 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.818469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5r5\" (UniqueName: \"kubernetes.io/projected/d1319426-40ee-40fc-86bf-64cca26d6860-kube-api-access-9f5r5\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.818557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr5jn\" (UniqueName: \"kubernetes.io/projected/1a80711d-831e-42ab-a5f8-6272eba9c635-kube-api-access-tr5jn\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.818660 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-metrics-certs\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc 
kubenswrapper[4722]: I0219 19:33:55.819219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-metrics-certs\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.819283 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-cert\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.819314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: E0219 19:33:55.819371 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 19:33:55 crc kubenswrapper[4722]: E0219 19:33:55.819419 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist podName:d1319426-40ee-40fc-86bf-64cca26d6860 nodeName:}" failed. No retries permitted until 2026-02-19 19:33:56.319402867 +0000 UTC m=+935.931753191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist") pod "speaker-nnmrq" (UID: "d1319426-40ee-40fc-86bf-64cca26d6860") : secret "metallb-memberlist" not found Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.819483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1319426-40ee-40fc-86bf-64cca26d6860-metallb-excludel2\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.820330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1319426-40ee-40fc-86bf-64cca26d6860-metallb-excludel2\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.822880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-metrics-certs\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.824411 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.838636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5r5\" (UniqueName: \"kubernetes.io/projected/d1319426-40ee-40fc-86bf-64cca26d6860-kube-api-access-9f5r5\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.925366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-metrics-certs\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.925680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-cert\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.925745 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr5jn\" (UniqueName: \"kubernetes.io/projected/1a80711d-831e-42ab-a5f8-6272eba9c635-kube-api-access-tr5jn\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.928423 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.933608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-metrics-certs\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.939914 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-cert\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.940642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr5jn\" (UniqueName: \"kubernetes.io/projected/1a80711d-831e-42ab-a5f8-6272eba9c635-kube-api-access-tr5jn\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.944436 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.031815 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q"] Feb 19 19:33:56 crc kubenswrapper[4722]: W0219 19:33:56.035767 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod505e06e7_65a2_4444_8552_8b96253c87fc.slice/crio-043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654 WatchSource:0}: Error finding container 043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654: Status 404 returned error can't find the container with id 043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654 Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.179442 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-h9kn7"] Feb 19 19:33:56 crc kubenswrapper[4722]: W0219 19:33:56.185684 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a80711d_831e_42ab_a5f8_6272eba9c635.slice/crio-7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d WatchSource:0}: Error finding container 7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d: Status 404 returned error can't find the container with id 7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.331103 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:56 crc kubenswrapper[4722]: E0219 19:33:56.331384 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Feb 19 19:33:56 crc kubenswrapper[4722]: E0219 19:33:56.331447 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist podName:d1319426-40ee-40fc-86bf-64cca26d6860 nodeName:}" failed. No retries permitted until 2026-02-19 19:33:57.331430417 +0000 UTC m=+936.943780741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist") pod "speaker-nnmrq" (UID: "d1319426-40ee-40fc-86bf-64cca26d6860") : secret "metallb-memberlist" not found Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.587814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"152ab4c38c225521aba2e1acafed36b4d320cc55e57360f92a20c8bf2c2e533c"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-h9kn7" event={"ID":"1a80711d-831e-42ab-a5f8-6272eba9c635","Type":"ContainerStarted","Data":"998f7c8f2634ab7e8cc0231960f414c17fd751c700d756b06944184f969ef182"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-h9kn7" event={"ID":"1a80711d-831e-42ab-a5f8-6272eba9c635","Type":"ContainerStarted","Data":"c1d0e87d2acc19f08b4837c6c3fdf31552869b14af1c937474cb535c6a4eee34"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-h9kn7" event={"ID":"1a80711d-831e-42ab-a5f8-6272eba9c635","Type":"ContainerStarted","Data":"7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589606 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.590935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" event={"ID":"505e06e7-65a2-4444-8552-8b96253c87fc","Type":"ContainerStarted","Data":"043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.605927 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-h9kn7" podStartSLOduration=1.605905243 podStartE2EDuration="1.605905243s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:33:56.605906333 +0000 UTC m=+936.218256677" watchObservedRunningTime="2026-02-19 19:33:56.605905243 +0000 UTC m=+936.218255577" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.345100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.351143 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.420462 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nnmrq" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.607588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nnmrq" event={"ID":"d1319426-40ee-40fc-86bf-64cca26d6860","Type":"ContainerStarted","Data":"a489e743abc6ad17e094b0618b96f2aa426d7d5ea65b004f3707264d30afe920"} Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.616198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nnmrq" event={"ID":"d1319426-40ee-40fc-86bf-64cca26d6860","Type":"ContainerStarted","Data":"6769b8f85e991ab6c87acb8ca3c3bfe3542ebcaf6815969f8eb2cd6aeffa1c1a"} Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.616460 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nnmrq" event={"ID":"d1319426-40ee-40fc-86bf-64cca26d6860","Type":"ContainerStarted","Data":"9ec8844df828a64613f7c9044c992d17068db4f154151cb9e9fa90a7f1057d2c"} Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.616491 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-nnmrq" Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.634041 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-nnmrq" podStartSLOduration=3.634026972 podStartE2EDuration="3.634026972s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:33:58.630982938 +0000 UTC m=+938.243333262" watchObservedRunningTime="2026-02-19 19:33:58.634026972 +0000 UTC m=+938.246377296" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.330160 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.330232 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.397210 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.656234 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.657456 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.672806 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.715224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.781469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.781621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.781643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887323 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.888612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.923487 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.978416 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:00 crc kubenswrapper[4722]: I0219 19:34:00.459434 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:00 crc kubenswrapper[4722]: I0219 19:34:00.633568 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerStarted","Data":"19f5d48196f47456f69c12a506337347dc985dee8f4eac6265950fed4107d051"} Feb 19 19:34:01 crc kubenswrapper[4722]: I0219 19:34:01.639377 4722 generic.go:334] "Generic (PLEG): container finished" podID="97a52bce-2539-405e-867d-922857a2ce75" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" exitCode=0 Feb 19 19:34:01 crc kubenswrapper[4722]: I0219 19:34:01.639421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230"} Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.040269 4722 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.040840 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8l56l" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" containerID="cri-o://12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597" gracePeriod=2 Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.648020 4722 generic.go:334] "Generic (PLEG): container finished" podID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerID="12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597" exitCode=0 Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.648100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597"} Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.425485 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.466023 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.466229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.466292 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.467531 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities" (OuterVolumeSpecName: "utilities") pod "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" (UID: "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.483514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6" (OuterVolumeSpecName: "kube-api-access-j6rz6") pod "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" (UID: "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e"). InnerVolumeSpecName "kube-api-access-j6rz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.507394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" (UID: "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.567493 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.567558 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.567568 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.667753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83"} Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.667810 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.668168 4722 scope.go:117] "RemoveContainer" containerID="12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.693532 4722 scope.go:117] "RemoveContainer" containerID="af96eb5f1e4c5f1724573bceb2f73b63f2d39809a28f76d211167409be3a723e" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.727673 4722 scope.go:117] "RemoveContainer" containerID="af1a7852e6e36db3bfdab60b2f38a0b981bc6b497a8bfeafd40f05127561893a" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.735063 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.740273 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.691625 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" event={"ID":"505e06e7-65a2-4444-8552-8b96253c87fc","Type":"ContainerStarted","Data":"c96664872083f47d40ce3d34aaf5dc56bde7e0c53f5e2c85d94d1ab7dd6b8ec8"} Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.692274 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.696130 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f25f2fe-8438-431d-9e9d-9efba0109efd" containerID="7ea840970662d7f3f2ae6378d191226722efed6f477f17b743321f3bde30ca52" exitCode=0 Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.696412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" 
event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerDied","Data":"7ea840970662d7f3f2ae6378d191226722efed6f477f17b743321f3bde30ca52"} Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.703992 4722 generic.go:334] "Generic (PLEG): container finished" podID="97a52bce-2539-405e-867d-922857a2ce75" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" exitCode=0 Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.704071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031"} Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.736827 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" podStartSLOduration=2.2306797879999998 podStartE2EDuration="11.736766727s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="2026-02-19 19:33:56.038054012 +0000 UTC m=+935.650404326" lastFinishedPulling="2026-02-19 19:34:05.544140931 +0000 UTC m=+945.156491265" observedRunningTime="2026-02-19 19:34:06.727704094 +0000 UTC m=+946.340054448" watchObservedRunningTime="2026-02-19 19:34:06.736766727 +0000 UTC m=+946.349117091" Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.079577 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" path="/var/lib/kubelet/pods/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e/volumes" Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.423966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-nnmrq" Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.711034 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f25f2fe-8438-431d-9e9d-9efba0109efd" 
containerID="f664664bdbde67ddfaaa146d9801ae6dc0166fb9a4082a014c6c102abbbc4ed2" exitCode=0 Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.711124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerDied","Data":"f664664bdbde67ddfaaa146d9801ae6dc0166fb9a4082a014c6c102abbbc4ed2"} Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.714534 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerStarted","Data":"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06"} Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.775585 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7894" podStartSLOduration=3.292267615 podStartE2EDuration="8.775558668s" podCreationTimestamp="2026-02-19 19:33:59 +0000 UTC" firstStartedPulling="2026-02-19 19:34:01.641123948 +0000 UTC m=+941.253474272" lastFinishedPulling="2026-02-19 19:34:07.124414981 +0000 UTC m=+946.736765325" observedRunningTime="2026-02-19 19:34:07.770184731 +0000 UTC m=+947.382535065" watchObservedRunningTime="2026-02-19 19:34:07.775558668 +0000 UTC m=+947.387908992" Feb 19 19:34:08 crc kubenswrapper[4722]: I0219 19:34:08.727276 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f25f2fe-8438-431d-9e9d-9efba0109efd" containerID="4d890c4dac125bb380d322709200de0a7be37d720848b6c18dd4ea38f5947388" exitCode=0 Feb 19 19:34:08 crc kubenswrapper[4722]: I0219 19:34:08.727328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerDied","Data":"4d890c4dac125bb380d322709200de0a7be37d720848b6c18dd4ea38f5947388"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742398 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"874b253e0bccb7f97ebd95fcb54c0e31eba020ca616e9fb59196bcf1ac283315"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742818 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"0b306ce5212c17fdbe1828b59c663f13d6b0eb82442bedd010f3f1d0ae151396"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"d4998c2193b90cda193d0b0cf7920519713faa546ce2f1032b76b9e2cbecef31"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"bd365dd3a658c77ec8005c17e8f91ba642703406a3c3f1c689db6b563245bb9c"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742897 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"7e40432bb8f30bfde889a12c17f0af54b3f76fbe90e89b03c3bdebb482bbfc64"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.979087 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.979168 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.039696 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7894" 
Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.753889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"310864aff00bb3126a6253a731db09195b44e5e3c05266c4ffa908556dd4ec81"} Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.781996 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-92pkj" podStartSLOduration=6.2603589809999995 podStartE2EDuration="15.781975554s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="2026-02-19 19:33:55.98665845 +0000 UTC m=+935.599008774" lastFinishedPulling="2026-02-19 19:34:05.508275023 +0000 UTC m=+945.120625347" observedRunningTime="2026-02-19 19:34:10.773957154 +0000 UTC m=+950.386307498" watchObservedRunningTime="2026-02-19 19:34:10.781975554 +0000 UTC m=+950.394325898" Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.825048 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.862529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:34:11 crc kubenswrapper[4722]: I0219 19:34:11.758991 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.659585 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-knsfg"] Feb 19 19:34:13 crc kubenswrapper[4722]: E0219 19:34:13.660106 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="extract-content" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660134 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" 
containerName="extract-content" Feb 19 19:34:13 crc kubenswrapper[4722]: E0219 19:34:13.660234 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="extract-utilities" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660249 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="extract-utilities" Feb 19 19:34:13 crc kubenswrapper[4722]: E0219 19:34:13.660270 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660283 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660480 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.661196 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.664509 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.664777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jnrhw" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.664926 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.669384 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-knsfg"] Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.784252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmd4q\" (UniqueName: \"kubernetes.io/projected/efd426b6-a53d-4127-ae59-e2f9aec632cc-kube-api-access-nmd4q\") pod \"openstack-operator-index-knsfg\" (UID: \"efd426b6-a53d-4127-ae59-e2f9aec632cc\") " pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.885908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmd4q\" (UniqueName: \"kubernetes.io/projected/efd426b6-a53d-4127-ae59-e2f9aec632cc-kube-api-access-nmd4q\") pod \"openstack-operator-index-knsfg\" (UID: \"efd426b6-a53d-4127-ae59-e2f9aec632cc\") " pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.908985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmd4q\" (UniqueName: \"kubernetes.io/projected/efd426b6-a53d-4127-ae59-e2f9aec632cc-kube-api-access-nmd4q\") pod \"openstack-operator-index-knsfg\" (UID: 
\"efd426b6-a53d-4127-ae59-e2f9aec632cc\") " pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:14 crc kubenswrapper[4722]: I0219 19:34:14.021348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:14 crc kubenswrapper[4722]: I0219 19:34:14.493542 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-knsfg"] Feb 19 19:34:14 crc kubenswrapper[4722]: I0219 19:34:14.784667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-knsfg" event={"ID":"efd426b6-a53d-4127-ae59-e2f9aec632cc","Type":"ContainerStarted","Data":"82ff5f971bb985c47c54ed6dd388bf0d32c67ddd65ee8d44e0802bf6a19497b9"} Feb 19 19:34:15 crc kubenswrapper[4722]: I0219 19:34:15.807482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:34:15 crc kubenswrapper[4722]: I0219 19:34:15.951875 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:34:17 crc kubenswrapper[4722]: I0219 19:34:17.823297 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-knsfg" event={"ID":"efd426b6-a53d-4127-ae59-e2f9aec632cc","Type":"ContainerStarted","Data":"0fb9d395a665f45d2643f4879001487512fbda36d3d30d7d2a1031a259e9a39b"} Feb 19 19:34:17 crc kubenswrapper[4722]: I0219 19:34:17.854680 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-knsfg" podStartSLOduration=2.661166546 podStartE2EDuration="4.85465532s" podCreationTimestamp="2026-02-19 19:34:13 +0000 UTC" firstStartedPulling="2026-02-19 19:34:14.501136306 +0000 UTC m=+954.113486650" lastFinishedPulling="2026-02-19 19:34:16.6946251 +0000 UTC m=+956.306975424" 
observedRunningTime="2026-02-19 19:34:17.851061918 +0000 UTC m=+957.463412272" watchObservedRunningTime="2026-02-19 19:34:17.85465532 +0000 UTC m=+957.467005684" Feb 19 19:34:20 crc kubenswrapper[4722]: I0219 19:34:20.018579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:23 crc kubenswrapper[4722]: I0219 19:34:23.848142 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:23 crc kubenswrapper[4722]: I0219 19:34:23.851093 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7894" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" containerID="cri-o://be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" gracePeriod=2 Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.022249 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.022670 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.054669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.249722 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.443399 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"97a52bce-2539-405e-867d-922857a2ce75\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.443546 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"97a52bce-2539-405e-867d-922857a2ce75\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.443607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"97a52bce-2539-405e-867d-922857a2ce75\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.444339 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities" (OuterVolumeSpecName: "utilities") pod "97a52bce-2539-405e-867d-922857a2ce75" (UID: "97a52bce-2539-405e-867d-922857a2ce75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.448962 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9" (OuterVolumeSpecName: "kube-api-access-jqsd9") pod "97a52bce-2539-405e-867d-922857a2ce75" (UID: "97a52bce-2539-405e-867d-922857a2ce75"). InnerVolumeSpecName "kube-api-access-jqsd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.491533 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97a52bce-2539-405e-867d-922857a2ce75" (UID: "97a52bce-2539-405e-867d-922857a2ce75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.545334 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.545386 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.545401 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.871661 4722 generic.go:334] "Generic (PLEG): container finished" podID="97a52bce-2539-405e-867d-922857a2ce75" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" exitCode=0 Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.872006 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.871865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06"} Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.872554 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"19f5d48196f47456f69c12a506337347dc985dee8f4eac6265950fed4107d051"} Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.872574 4722 scope.go:117] "RemoveContainer" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.891651 4722 scope.go:117] "RemoveContainer" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.910080 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.920165 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.925389 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.931477 4722 scope.go:117] "RemoveContainer" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.960325 4722 scope.go:117] "RemoveContainer" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" Feb 19 19:34:24 crc 
kubenswrapper[4722]: E0219 19:34:24.960743 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06\": container with ID starting with be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06 not found: ID does not exist" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.960787 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06"} err="failed to get container status \"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06\": rpc error: code = NotFound desc = could not find container \"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06\": container with ID starting with be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06 not found: ID does not exist" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.960818 4722 scope.go:117] "RemoveContainer" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" Feb 19 19:34:24 crc kubenswrapper[4722]: E0219 19:34:24.961182 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031\": container with ID starting with aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031 not found: ID does not exist" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.961221 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031"} err="failed to get container status 
\"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031\": rpc error: code = NotFound desc = could not find container \"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031\": container with ID starting with aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031 not found: ID does not exist" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.961246 4722 scope.go:117] "RemoveContainer" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" Feb 19 19:34:24 crc kubenswrapper[4722]: E0219 19:34:24.961594 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230\": container with ID starting with cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230 not found: ID does not exist" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.961624 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230"} err="failed to get container status \"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230\": rpc error: code = NotFound desc = could not find container \"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230\": container with ID starting with cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230 not found: ID does not exist" Feb 19 19:34:25 crc kubenswrapper[4722]: I0219 19:34:25.079440 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a52bce-2539-405e-867d-922857a2ce75" path="/var/lib/kubelet/pods/97a52bce-2539-405e-867d-922857a2ce75/volumes" Feb 19 19:34:25 crc kubenswrapper[4722]: I0219 19:34:25.828755 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-92pkj" Feb 19 
19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487126 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2"] Feb 19 19:34:26 crc kubenswrapper[4722]: E0219 19:34:26.487840 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-content" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487862 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-content" Feb 19 19:34:26 crc kubenswrapper[4722]: E0219 19:34:26.487905 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487920 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" Feb 19 19:34:26 crc kubenswrapper[4722]: E0219 19:34:26.487937 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-utilities" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487975 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-utilities" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.488229 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.489896 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.491702 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gb525" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.501578 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2"] Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.571897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.571998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.572077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 
19:34:26.673310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.673390 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.673468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.673775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.674026 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.696736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.805362 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.254284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2"] Feb 19 19:34:27 crc kubenswrapper[4722]: W0219 19:34:27.256171 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4260359d_1333_4ec5_9a57_16e2782fcf0f.slice/crio-83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a WatchSource:0}: Error finding container 83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a: Status 404 returned error can't find the container with id 83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a Feb 19 19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.893186 4722 generic.go:334] "Generic (PLEG): container finished" podID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerID="ab7554108563399dcf6667cfbe885a7d27fad38216d3ea33bca15a54ecc72218" exitCode=0 Feb 19 
19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.893291 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"ab7554108563399dcf6667cfbe885a7d27fad38216d3ea33bca15a54ecc72218"} Feb 19 19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.893472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerStarted","Data":"83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a"} Feb 19 19:34:30 crc kubenswrapper[4722]: I0219 19:34:30.916514 4722 generic.go:334] "Generic (PLEG): container finished" podID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerID="0fff30a4a814644ff10c1e4f1b8d8a6bb84fe24437b4e7448f13179de5df38ed" exitCode=0 Feb 19 19:34:30 crc kubenswrapper[4722]: I0219 19:34:30.916579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"0fff30a4a814644ff10c1e4f1b8d8a6bb84fe24437b4e7448f13179de5df38ed"} Feb 19 19:34:31 crc kubenswrapper[4722]: I0219 19:34:31.925754 4722 generic.go:334] "Generic (PLEG): container finished" podID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerID="68a41a01ba9a1fef435bf6f5b6b832084a0ab857191fb46380ee752eaeebab7d" exitCode=0 Feb 19 19:34:31 crc kubenswrapper[4722]: I0219 19:34:31.925857 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"68a41a01ba9a1fef435bf6f5b6b832084a0ab857191fb46380ee752eaeebab7d"} Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.219130 
4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.277451 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"4260359d-1333-4ec5-9a57-16e2782fcf0f\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.277905 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"4260359d-1333-4ec5-9a57-16e2782fcf0f\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.277975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"4260359d-1333-4ec5-9a57-16e2782fcf0f\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.279821 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle" (OuterVolumeSpecName: "bundle") pod "4260359d-1333-4ec5-9a57-16e2782fcf0f" (UID: "4260359d-1333-4ec5-9a57-16e2782fcf0f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.285515 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b" (OuterVolumeSpecName: "kube-api-access-4v47b") pod "4260359d-1333-4ec5-9a57-16e2782fcf0f" (UID: "4260359d-1333-4ec5-9a57-16e2782fcf0f"). 
InnerVolumeSpecName "kube-api-access-4v47b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.291051 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util" (OuterVolumeSpecName: "util") pod "4260359d-1333-4ec5-9a57-16e2782fcf0f" (UID: "4260359d-1333-4ec5-9a57-16e2782fcf0f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.379357 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.379396 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.379405 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.939382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a"} Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.939418 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.939437 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.238787 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q"] Feb 19 19:34:38 crc kubenswrapper[4722]: E0219 19:34:38.239380 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="extract" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239394 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="extract" Feb 19 19:34:38 crc kubenswrapper[4722]: E0219 19:34:38.239408 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="util" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239415 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="util" Feb 19 19:34:38 crc kubenswrapper[4722]: E0219 19:34:38.239435 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="pull" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="pull" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239567 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="extract" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.240126 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.242541 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-r8q9n" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.270940 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q"] Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.388186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdmxc\" (UniqueName: \"kubernetes.io/projected/fb86a4c4-379d-4dcd-86c5-5ee95092e6c0-kube-api-access-gdmxc\") pod \"openstack-operator-controller-init-6ddf4746f6-l927q\" (UID: \"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0\") " pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.490111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdmxc\" (UniqueName: \"kubernetes.io/projected/fb86a4c4-379d-4dcd-86c5-5ee95092e6c0-kube-api-access-gdmxc\") pod \"openstack-operator-controller-init-6ddf4746f6-l927q\" (UID: \"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0\") " pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.528233 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdmxc\" (UniqueName: \"kubernetes.io/projected/fb86a4c4-379d-4dcd-86c5-5ee95092e6c0-kube-api-access-gdmxc\") pod \"openstack-operator-controller-init-6ddf4746f6-l927q\" (UID: \"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0\") " pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.557066 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.796360 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q"] Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.801576 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:34:39 crc kubenswrapper[4722]: I0219 19:34:39.080366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" event={"ID":"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0","Type":"ContainerStarted","Data":"286a488e18fa657ccdd1b9a0961c73953796c8219c1d84e9f385a0d4d335dfc9"} Feb 19 19:34:41 crc kubenswrapper[4722]: I0219 19:34:41.799532 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:34:41 crc kubenswrapper[4722]: I0219 19:34:41.799900 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:34:44 crc kubenswrapper[4722]: I0219 19:34:44.115876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" event={"ID":"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0","Type":"ContainerStarted","Data":"040d704b01b3a5be919620b4ae2c981f0305775b07ff217791abe991d1521961"} Feb 19 19:34:44 crc kubenswrapper[4722]: I0219 19:34:44.116269 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:48 crc kubenswrapper[4722]: I0219 19:34:48.561037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:48 crc kubenswrapper[4722]: I0219 19:34:48.610663 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" podStartSLOduration=5.8653512249999995 podStartE2EDuration="10.610631043s" podCreationTimestamp="2026-02-19 19:34:38 +0000 UTC" firstStartedPulling="2026-02-19 19:34:38.801216158 +0000 UTC m=+978.413566482" lastFinishedPulling="2026-02-19 19:34:43.546495976 +0000 UTC m=+983.158846300" observedRunningTime="2026-02-19 19:34:44.144883358 +0000 UTC m=+983.757233682" watchObservedRunningTime="2026-02-19 19:34:48.610631043 +0000 UTC m=+988.222981427" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.006621 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.008450 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.011046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-km9dx" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.011178 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.012042 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.013664 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qtmnj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.033693 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.042006 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.065478 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.066510 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.075071 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.076422 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.085316 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-m6mbq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.085484 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.086269 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.086599 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zkwq2" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.096312 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-d9thq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.098181 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.115327 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.117386 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.119732 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.122523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-zhb7q" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.133137 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.163980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.182228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.183099 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.188484 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.189138 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-f2sz9" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.189273 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.190053 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.199518 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-wf2jp" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87z7d\" (UniqueName: \"kubernetes.io/projected/0af2e6ef-277d-4022-b42b-5639b589fef9-kube-api-access-87z7d\") pod \"barbican-operator-controller-manager-868647ff47-k5c54\" (UID: \"0af2e6ef-277d-4022-b42b-5639b589fef9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sskjk\" (UniqueName: \"kubernetes.io/projected/b64009a1-83ef-4d66-bc6b-80ccfc6f7727-kube-api-access-sskjk\") pod \"cinder-operator-controller-manager-5d946d989d-x7bwr\" (UID: \"b64009a1-83ef-4d66-bc6b-80ccfc6f7727\") " 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xxth\" (UniqueName: \"kubernetes.io/projected/edbe95e5-3a5d-4dec-9a94-509234857155-kube-api-access-2xxth\") pod \"designate-operator-controller-manager-6d8bf5c495-mc64t\" (UID: \"edbe95e5-3a5d-4dec-9a94-509234857155\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxm5\" (UniqueName: \"kubernetes.io/projected/baba09d1-2238-4ca1-98ee-f44938b68cd3-kube-api-access-4dxm5\") pod \"glance-operator-controller-manager-77987464f4-hxv5g\" (UID: \"baba09d1-2238-4ca1-98ee-f44938b68cd3\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxr5d\" (UniqueName: \"kubernetes.io/projected/019f7edd-1d9b-4069-a2a1-36bbe6b0a567-kube-api-access-lxr5d\") pod \"heat-operator-controller-manager-69f49c598c-qrsw8\" (UID: \"019f7edd-1d9b-4069-a2a1-36bbe6b0a567\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.238307 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.239490 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.242494 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.243641 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.248652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.251662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zjmkn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.255077 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.262687 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-b2d6t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.271441 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.279897 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.282294 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.285831 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.291186 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.291907 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.298132 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8bmhz" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.298844 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-tbls9" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.300510 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.312930 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.317370 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-cj8np" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.329532 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhl9\" (UniqueName: \"kubernetes.io/projected/c36983b4-b7f9-4834-85e9-a5c3cb83eb2d-kube-api-access-xwhl9\") pod \"ironic-operator-controller-manager-554564d7fc-rnh9h\" (UID: \"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.329652 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sskjk\" (UniqueName: \"kubernetes.io/projected/b64009a1-83ef-4d66-bc6b-80ccfc6f7727-kube-api-access-sskjk\") pod \"cinder-operator-controller-manager-5d946d989d-x7bwr\" (UID: \"b64009a1-83ef-4d66-bc6b-80ccfc6f7727\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.329697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xxth\" (UniqueName: \"kubernetes.io/projected/edbe95e5-3a5d-4dec-9a94-509234857155-kube-api-access-2xxth\") pod \"designate-operator-controller-manager-6d8bf5c495-mc64t\" (UID: \"edbe95e5-3a5d-4dec-9a94-509234857155\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5f69\" (UniqueName: \"kubernetes.io/projected/2c02c7e1-6f72-44be-a4fb-10ca1df420aa-kube-api-access-x5f69\") pod 
\"horizon-operator-controller-manager-5b9b8895d5-hncxm\" (UID: \"2c02c7e1-6f72-44be-a4fb-10ca1df420aa\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxm5\" (UniqueName: \"kubernetes.io/projected/baba09d1-2238-4ca1-98ee-f44938b68cd3-kube-api-access-4dxm5\") pod \"glance-operator-controller-manager-77987464f4-hxv5g\" (UID: \"baba09d1-2238-4ca1-98ee-f44938b68cd3\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xcp\" (UniqueName: \"kubernetes.io/projected/421f6539-4fcb-4949-ba29-34997fc98490-kube-api-access-w8xcp\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxr5d\" (UniqueName: \"kubernetes.io/projected/019f7edd-1d9b-4069-a2a1-36bbe6b0a567-kube-api-access-lxr5d\") pod \"heat-operator-controller-manager-69f49c598c-qrsw8\" (UID: \"019f7edd-1d9b-4069-a2a1-36bbe6b0a567\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.332082 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87z7d\" (UniqueName: \"kubernetes.io/projected/0af2e6ef-277d-4022-b42b-5639b589fef9-kube-api-access-87z7d\") pod \"barbican-operator-controller-manager-868647ff47-k5c54\" (UID: \"0af2e6ef-277d-4022-b42b-5639b589fef9\") " 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.332220 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.348122 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.370403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxr5d\" (UniqueName: \"kubernetes.io/projected/019f7edd-1d9b-4069-a2a1-36bbe6b0a567-kube-api-access-lxr5d\") pod \"heat-operator-controller-manager-69f49c598c-qrsw8\" (UID: \"019f7edd-1d9b-4069-a2a1-36bbe6b0a567\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.372776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxm5\" (UniqueName: \"kubernetes.io/projected/baba09d1-2238-4ca1-98ee-f44938b68cd3-kube-api-access-4dxm5\") pod \"glance-operator-controller-manager-77987464f4-hxv5g\" (UID: \"baba09d1-2238-4ca1-98ee-f44938b68cd3\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.373532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87z7d\" (UniqueName: \"kubernetes.io/projected/0af2e6ef-277d-4022-b42b-5639b589fef9-kube-api-access-87z7d\") pod \"barbican-operator-controller-manager-868647ff47-k5c54\" (UID: \"0af2e6ef-277d-4022-b42b-5639b589fef9\") " 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.375888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sskjk\" (UniqueName: \"kubernetes.io/projected/b64009a1-83ef-4d66-bc6b-80ccfc6f7727-kube-api-access-sskjk\") pod \"cinder-operator-controller-manager-5d946d989d-x7bwr\" (UID: \"b64009a1-83ef-4d66-bc6b-80ccfc6f7727\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.381709 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.387850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xxth\" (UniqueName: \"kubernetes.io/projected/edbe95e5-3a5d-4dec-9a94-509234857155-kube-api-access-2xxth\") pod \"designate-operator-controller-manager-6d8bf5c495-mc64t\" (UID: \"edbe95e5-3a5d-4dec-9a94-509234857155\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.392610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.393874 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.395624 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.401771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.404371 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-kdjrp" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.410021 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.412426 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5f69\" (UniqueName: \"kubernetes.io/projected/2c02c7e1-6f72-44be-a4fb-10ca1df420aa-kube-api-access-x5f69\") pod \"horizon-operator-controller-manager-5b9b8895d5-hncxm\" (UID: \"2c02c7e1-6f72-44be-a4fb-10ca1df420aa\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434829 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67z9c\" (UniqueName: \"kubernetes.io/projected/57783601-5230-49ef-8ac2-0ddf78bd4b3a-kube-api-access-67z9c\") pod \"neutron-operator-controller-manager-64ddbf8bb-6t7g6\" (UID: \"57783601-5230-49ef-8ac2-0ddf78bd4b3a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-w8xcp\" (UniqueName: \"kubernetes.io/projected/421f6539-4fcb-4949-ba29-34997fc98490-kube-api-access-w8xcp\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlwlx\" (UniqueName: \"kubernetes.io/projected/64ff9a64-f79f-4a45-943d-36152964cfcd-kube-api-access-xlwlx\") pod \"nova-operator-controller-manager-567668f5cf-wqp5t\" (UID: \"64ff9a64-f79f-4a45-943d-36152964cfcd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434919 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvwq5\" (UniqueName: \"kubernetes.io/projected/b37b04c7-5374-49d3-97c0-5b5b27c4a220-kube-api-access-zvwq5\") pod \"mariadb-operator-controller-manager-6994f66f48-8cljg\" (UID: \"b37b04c7-5374-49d3-97c0-5b5b27c4a220\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8gb\" (UniqueName: \"kubernetes.io/projected/766eebc1-05fc-4ca0-8c75-276632a6597e-kube-api-access-wd8gb\") pod \"manila-operator-controller-manager-54f6768c69-7qkx4\" (UID: \"766eebc1-05fc-4ca0-8c75-276632a6597e\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod 
\"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhl9\" (UniqueName: \"kubernetes.io/projected/c36983b4-b7f9-4834-85e9-a5c3cb83eb2d-kube-api-access-xwhl9\") pod \"ironic-operator-controller-manager-554564d7fc-rnh9h\" (UID: \"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.435012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsjl6\" (UniqueName: \"kubernetes.io/projected/db329f91-74f2-4baa-ab5a-85ad999fc8ef-kube-api-access-zsjl6\") pod \"keystone-operator-controller-manager-b4d948c87-x6wk7\" (UID: \"db329f91-74f2-4baa-ab5a-85ad999fc8ef\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.435218 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.435695 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.435759 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:08.935737771 +0000 UTC m=+1008.548088135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.442946 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.444713 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.448215 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fcjp9" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.448397 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.457470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5f69\" (UniqueName: \"kubernetes.io/projected/2c02c7e1-6f72-44be-a4fb-10ca1df420aa-kube-api-access-x5f69\") pod \"horizon-operator-controller-manager-5b9b8895d5-hncxm\" (UID: \"2c02c7e1-6f72-44be-a4fb-10ca1df420aa\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.459714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhl9\" (UniqueName: \"kubernetes.io/projected/c36983b4-b7f9-4834-85e9-a5c3cb83eb2d-kube-api-access-xwhl9\") pod \"ironic-operator-controller-manager-554564d7fc-rnh9h\" (UID: \"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 
19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.466413 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.467326 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.468568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xcp\" (UniqueName: \"kubernetes.io/projected/421f6539-4fcb-4949-ba29-34997fc98490-kube-api-access-w8xcp\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.469491 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.472419 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tmczn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.477440 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.481236 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.483416 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tgxpl" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.485301 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.494165 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.495984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.498096 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fbsdz" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.509031 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.518550 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.536329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsjl6\" (UniqueName: \"kubernetes.io/projected/db329f91-74f2-4baa-ab5a-85ad999fc8ef-kube-api-access-zsjl6\") pod \"keystone-operator-controller-manager-b4d948c87-x6wk7\" (UID: \"db329f91-74f2-4baa-ab5a-85ad999fc8ef\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.536667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf42n\" (UniqueName: \"kubernetes.io/projected/a6fb3554-24ea-4330-b2cb-1c91f105345d-kube-api-access-jf42n\") pod \"octavia-operator-controller-manager-69f8888797-zft4s\" (UID: \"a6fb3554-24ea-4330-b2cb-1c91f105345d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.536691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67z9c\" (UniqueName: \"kubernetes.io/projected/57783601-5230-49ef-8ac2-0ddf78bd4b3a-kube-api-access-67z9c\") pod \"neutron-operator-controller-manager-64ddbf8bb-6t7g6\" (UID: \"57783601-5230-49ef-8ac2-0ddf78bd4b3a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.542292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlwlx\" (UniqueName: \"kubernetes.io/projected/64ff9a64-f79f-4a45-943d-36152964cfcd-kube-api-access-xlwlx\") pod \"nova-operator-controller-manager-567668f5cf-wqp5t\" (UID: \"64ff9a64-f79f-4a45-943d-36152964cfcd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: 
I0219 19:35:08.542357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvwq5\" (UniqueName: \"kubernetes.io/projected/b37b04c7-5374-49d3-97c0-5b5b27c4a220-kube-api-access-zvwq5\") pod \"mariadb-operator-controller-manager-6994f66f48-8cljg\" (UID: \"b37b04c7-5374-49d3-97c0-5b5b27c4a220\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.542385 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8gb\" (UniqueName: \"kubernetes.io/projected/766eebc1-05fc-4ca0-8c75-276632a6597e-kube-api-access-wd8gb\") pod \"manila-operator-controller-manager-54f6768c69-7qkx4\" (UID: \"766eebc1-05fc-4ca0-8c75-276632a6597e\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.551126 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.557011 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67z9c\" (UniqueName: \"kubernetes.io/projected/57783601-5230-49ef-8ac2-0ddf78bd4b3a-kube-api-access-67z9c\") pod \"neutron-operator-controller-manager-64ddbf8bb-6t7g6\" (UID: \"57783601-5230-49ef-8ac2-0ddf78bd4b3a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.559700 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.560537 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.562631 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hvpw7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.568267 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsjl6\" (UniqueName: \"kubernetes.io/projected/db329f91-74f2-4baa-ab5a-85ad999fc8ef-kube-api-access-zsjl6\") pod \"keystone-operator-controller-manager-b4d948c87-x6wk7\" (UID: \"db329f91-74f2-4baa-ab5a-85ad999fc8ef\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.569294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvwq5\" (UniqueName: \"kubernetes.io/projected/b37b04c7-5374-49d3-97c0-5b5b27c4a220-kube-api-access-zvwq5\") pod \"mariadb-operator-controller-manager-6994f66f48-8cljg\" (UID: \"b37b04c7-5374-49d3-97c0-5b5b27c4a220\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.570327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8gb\" (UniqueName: \"kubernetes.io/projected/766eebc1-05fc-4ca0-8c75-276632a6597e-kube-api-access-wd8gb\") pod \"manila-operator-controller-manager-54f6768c69-7qkx4\" (UID: \"766eebc1-05fc-4ca0-8c75-276632a6597e\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.576185 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlwlx\" (UniqueName: \"kubernetes.io/projected/64ff9a64-f79f-4a45-943d-36152964cfcd-kube-api-access-xlwlx\") pod \"nova-operator-controller-manager-567668f5cf-wqp5t\" (UID: 
\"64ff9a64-f79f-4a45-943d-36152964cfcd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.577759 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.593126 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.619181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.629562 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.640197 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.643337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-dbdmf"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg4n\" (UniqueName: \"kubernetes.io/projected/820eede6-6396-4466-bf00-5d3b39d982d6-kube-api-access-cwg4n\") pod \"placement-operator-controller-manager-8497b45c89-mgzgq\" (UID: \"820eede6-6396-4466-bf00-5d3b39d982d6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf42n\" (UniqueName: \"kubernetes.io/projected/a6fb3554-24ea-4330-b2cb-1c91f105345d-kube-api-access-jf42n\") pod \"octavia-operator-controller-manager-69f8888797-zft4s\" (UID: \"a6fb3554-24ea-4330-b2cb-1c91f105345d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644251 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmjb\" (UniqueName: \"kubernetes.io/projected/29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8-kube-api-access-7lmjb\") pod 
\"swift-operator-controller-manager-68f46476f-wktqn\" (UID: \"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64jz\" (UniqueName: \"kubernetes.io/projected/8870a7b1-f894-4429-9f52-d9063fe9c780-kube-api-access-s64jz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644315 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6njp\" (UniqueName: \"kubernetes.io/projected/738a1346-88e9-4c4e-b7ce-1878736e2493-kube-api-access-h6njp\") pod \"ovn-operator-controller-manager-d44cf6b75-6dlqc\" (UID: \"738a1346-88e9-4c4e-b7ce-1878736e2493\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.646524 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vl4wv" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.653267 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-dbdmf"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.659940 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.677840 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.679532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf42n\" (UniqueName: \"kubernetes.io/projected/a6fb3554-24ea-4330-b2cb-1c91f105345d-kube-api-access-jf42n\") pod \"octavia-operator-controller-manager-69f8888797-zft4s\" (UID: \"a6fb3554-24ea-4330-b2cb-1c91f105345d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.679618 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.682040 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gbphg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.684399 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6njp\" (UniqueName: \"kubernetes.io/projected/738a1346-88e9-4c4e-b7ce-1878736e2493-kube-api-access-h6njp\") pod \"ovn-operator-controller-manager-d44cf6b75-6dlqc\" (UID: \"738a1346-88e9-4c4e-b7ce-1878736e2493\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwg4n\" 
(UniqueName: \"kubernetes.io/projected/820eede6-6396-4466-bf00-5d3b39d982d6-kube-api-access-cwg4n\") pod \"placement-operator-controller-manager-8497b45c89-mgzgq\" (UID: \"820eede6-6396-4466-bf00-5d3b39d982d6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6skxh\" (UniqueName: \"kubernetes.io/projected/792a7a0a-a11e-42ce-a99b-e24127e7bbe8-kube-api-access-6skxh\") pod \"telemetry-operator-controller-manager-5484b6858b-7g48c\" (UID: \"792a7a0a-a11e-42ce-a99b-e24127e7bbe8\") " pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqf2\" (UniqueName: \"kubernetes.io/projected/2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac-kube-api-access-6nqf2\") pod \"test-operator-controller-manager-7866795846-dbdmf\" (UID: \"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac\") " pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmjb\" (UniqueName: \"kubernetes.io/projected/29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8-kube-api-access-7lmjb\") pod 
\"swift-operator-controller-manager-68f46476f-wktqn\" (UID: \"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.749010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64jz\" (UniqueName: \"kubernetes.io/projected/8870a7b1-f894-4429-9f52-d9063fe9c780-kube-api-access-s64jz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.751904 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.751988 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.251948029 +0000 UTC m=+1008.864298353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.753114 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.761450 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.762316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.768703 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.768939 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4zvsr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.769666 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.792625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6njp\" (UniqueName: \"kubernetes.io/projected/738a1346-88e9-4c4e-b7ce-1878736e2493-kube-api-access-h6njp\") pod \"ovn-operator-controller-manager-d44cf6b75-6dlqc\" (UID: \"738a1346-88e9-4c4e-b7ce-1878736e2493\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.793496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64jz\" (UniqueName: \"kubernetes.io/projected/8870a7b1-f894-4429-9f52-d9063fe9c780-kube-api-access-s64jz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 
19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.797269 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwg4n\" (UniqueName: \"kubernetes.io/projected/820eede6-6396-4466-bf00-5d3b39d982d6-kube-api-access-cwg4n\") pod \"placement-operator-controller-manager-8497b45c89-mgzgq\" (UID: \"820eede6-6396-4466-bf00-5d3b39d982d6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.797498 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.797738 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.799296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmjb\" (UniqueName: \"kubernetes.io/projected/29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8-kube-api-access-7lmjb\") pod \"swift-operator-controller-manager-68f46476f-wktqn\" (UID: \"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.815167 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850555 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrft7\" (UniqueName: \"kubernetes.io/projected/f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0-kube-api-access-nrft7\") pod \"watcher-operator-controller-manager-5db88f68c-zdfxj\" (UID: \"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6skxh\" (UniqueName: \"kubernetes.io/projected/792a7a0a-a11e-42ce-a99b-e24127e7bbe8-kube-api-access-6skxh\") pod \"telemetry-operator-controller-manager-5484b6858b-7g48c\" (UID: \"792a7a0a-a11e-42ce-a99b-e24127e7bbe8\") " pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhqdp\" (UniqueName: \"kubernetes.io/projected/12f061e0-51af-4ab9-a8a7-26b2775651e1-kube-api-access-rhqdp\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 
19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850647 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqf2\" (UniqueName: \"kubernetes.io/projected/2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac-kube-api-access-6nqf2\") pod \"test-operator-controller-manager-7866795846-dbdmf\" (UID: \"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac\") " pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850666 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.855009 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.864287 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.865864 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.866315 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.867812 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-52tp7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.871807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6skxh\" (UniqueName: \"kubernetes.io/projected/792a7a0a-a11e-42ce-a99b-e24127e7bbe8-kube-api-access-6skxh\") pod \"telemetry-operator-controller-manager-5484b6858b-7g48c\" (UID: \"792a7a0a-a11e-42ce-a99b-e24127e7bbe8\") " pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.872142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqf2\" (UniqueName: \"kubernetes.io/projected/2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac-kube-api-access-6nqf2\") pod \"test-operator-controller-manager-7866795846-dbdmf\" (UID: \"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac\") " pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.906891 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.952931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrft7\" (UniqueName: \"kubernetes.io/projected/f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0-kube-api-access-nrft7\") pod \"watcher-operator-controller-manager-5db88f68c-zdfxj\" (UID: \"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.952999 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rhqdp\" (UniqueName: \"kubernetes.io/projected/12f061e0-51af-4ab9-a8a7-26b2775651e1-kube-api-access-rhqdp\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdzq\" (UniqueName: \"kubernetes.io/projected/65b17979-6c94-40e6-ac54-41a61a726e87-kube-api-access-sfdzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjv7d\" (UID: \"65b17979-6c94-40e6-ac54-41a61a726e87\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod 
\"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.953379 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.953438 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.953418959 +0000 UTC m=+1009.565769283 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954219 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954253 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.454242614 +0000 UTC m=+1009.066592938 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954344 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954438 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.4544196 +0000 UTC m=+1009.066769924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.991926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrft7\" (UniqueName: \"kubernetes.io/projected/f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0-kube-api-access-nrft7\") pod \"watcher-operator-controller-manager-5db88f68c-zdfxj\" (UID: \"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.994089 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.026638 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.037003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhqdp\" (UniqueName: \"kubernetes.io/projected/12f061e0-51af-4ab9-a8a7-26b2775651e1-kube-api-access-rhqdp\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.059773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfdzq\" (UniqueName: \"kubernetes.io/projected/65b17979-6c94-40e6-ac54-41a61a726e87-kube-api-access-sfdzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjv7d\" (UID: \"65b17979-6c94-40e6-ac54-41a61a726e87\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.069490 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.098999 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.104551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfdzq\" (UniqueName: \"kubernetes.io/projected/65b17979-6c94-40e6-ac54-41a61a726e87-kube-api-access-sfdzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjv7d\" (UID: \"65b17979-6c94-40e6-ac54-41a61a726e87\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.136826 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.147974 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.168382 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.208397 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.262122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.262360 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.262407 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:10.26239153 +0000 UTC m=+1009.874741854 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.291543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" event={"ID":"edbe95e5-3a5d-4dec-9a94-509234857155","Type":"ContainerStarted","Data":"737a9653409a2ed747e71d9e9d0ff2d724ec46e51dcb4a7f4f3a696444b09f80"} Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.342776 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.397924 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.464922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.465063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 
19:35:09.465251 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.465312 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:10.465292055 +0000 UTC m=+1010.077642379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.465314 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.465381 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:10.465362287 +0000 UTC m=+1010.077712611 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.753449 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.765842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.783984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.915567 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"] Feb 19 19:35:09 crc kubenswrapper[4722]: W0219 19:35:09.919836 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57783601_5230_49ef_8ac2_0ddf78bd4b3a.slice/crio-6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b WatchSource:0}: Error finding container 6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b: Status 404 returned error can't find the container with id 6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.976398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.976550 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.976615 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:11.976599963 +0000 UTC m=+1011.588950277 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.118048 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"] Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.125339 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37b04c7_5374_49d3_97c0_5b5b27c4a220.slice/crio-defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563 WatchSource:0}: Error finding container defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563: Status 404 returned error can't find the container with id defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563 Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.125556 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"] Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.126279 4722 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b17979_6c94_40e6_ac54_41a61a726e87.slice/crio-9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9 WatchSource:0}: Error finding container 9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9: Status 404 returned error can't find the container with id 9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9 Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.131341 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.135657 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.140030 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.160294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.169923 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"] Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.183474 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6fb3554_24ea_4330_b2cb_1c91f105345d.slice/crio-8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207 WatchSource:0}: Error finding container 8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207: Status 404 returned error can't find the container with id 8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207 Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 
19:35:10.201714 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x5f69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-hncxm_openstack-operators(2c02c7e1-6f72-44be-a4fb-10ca1df420aa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.209958 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podUID="2c02c7e1-6f72-44be-a4fb-10ca1df420aa" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.281101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.281347 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.281408 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:12.281389464 +0000 UTC m=+1011.893739788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.309523 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-dbdmf"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.312800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" event={"ID":"2c02c7e1-6f72-44be-a4fb-10ca1df420aa","Type":"ContainerStarted","Data":"88a0d923d1f4f45462f59caff291c2d46d7aec53ff989922ac41ce29b90d0431"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.313705 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podUID="2c02c7e1-6f72-44be-a4fb-10ca1df420aa" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.313952 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.314956 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" event={"ID":"a6fb3554-24ea-4330-b2cb-1c91f105345d","Type":"ContainerStarted","Data":"8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207"} Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.322331 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ff9a64_f79f_4a45_943d_36152964cfcd.slice/crio-724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82 WatchSource:0}: Error finding container 724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82: Status 404 returned error can't find the container with id 724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82 Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.322662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" event={"ID":"57783601-5230-49ef-8ac2-0ddf78bd4b3a","Type":"ContainerStarted","Data":"6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.324322 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" event={"ID":"820eede6-6396-4466-bf00-5d3b39d982d6","Type":"ContainerStarted","Data":"17fb740d71db5c3594c816c5ccaa09c5afbf67be88fdd88e1ac550ff74c2deb6"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.334227 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.336931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" 
event={"ID":"65b17979-6c94-40e6-ac54-41a61a726e87","Type":"ContainerStarted","Data":"9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.338411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" event={"ID":"766eebc1-05fc-4ca0-8c75-276632a6597e","Type":"ContainerStarted","Data":"b1b2d484e355b55c092d25a3a139271cf0cd78ecc1c8de758660f56a7e1fd34b"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.343486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" event={"ID":"db329f91-74f2-4baa-ab5a-85ad999fc8ef","Type":"ContainerStarted","Data":"80c6af299b43c30d0772a91d8590b548a277a17a3b240bf9f9a325e6e31ae74e"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.343957 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nqf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-dbdmf_openstack-operators(2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.346419 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podUID="2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac" Feb 19 19:35:10 crc 
kubenswrapper[4722]: I0219 19:35:10.347450 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" event={"ID":"b64009a1-83ef-4d66-bc6b-80ccfc6f7727","Type":"ContainerStarted","Data":"e63c5aac3215063983439da9853bc504afc8d16b9b380b482608e7e4b3e0f990"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.357588 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c"] Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.358432 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:b83e41b73854b29b5d6860a1d9e9ac7611640781,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6skxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5484b6858b-7g48c_openstack-operators(792a7a0a-a11e-42ce-a99b-e24127e7bbe8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.359040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" event={"ID":"b37b04c7-5374-49d3-97c0-5b5b27c4a220","Type":"ContainerStarted","Data":"defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.359862 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podUID="792a7a0a-a11e-42ce-a99b-e24127e7bbe8" Feb 19 19:35:10 crc 
kubenswrapper[4722]: I0219 19:35:10.361832 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" event={"ID":"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d","Type":"ContainerStarted","Data":"ffc3d12e367cebd1cfb82440cd3115cefabcea3a1340171d9ec9798cc1d9e90e"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.363354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" event={"ID":"019f7edd-1d9b-4069-a2a1-36bbe6b0a567","Type":"ContainerStarted","Data":"13c6429d8da153184ee95dcd6ec0c902b204ffb2db95645138a02a88428298c4"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.364047 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" event={"ID":"baba09d1-2238-4ca1-98ee-f44938b68cd3","Type":"ContainerStarted","Data":"2149a0615d3124e22a80a9aeb9434ab419be0511f46fc8bde6386401cf205fab"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.387700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" event={"ID":"0af2e6ef-277d-4022-b42b-5639b589fef9","Type":"ContainerStarted","Data":"30c7d8cc11849ef878a5bbae8ba84f80de9816ad9cc6b57efc3acf91422921bb"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.388770 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7lmjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-wktqn_openstack-operators(29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.390277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" event={"ID":"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0","Type":"ContainerStarted","Data":"c7fc3e072fdb34adcb59876dc21c99e36b93d371565d45a283a079b2d908006d"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.390305 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" podUID="29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.392449 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h6njp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-6dlqc_openstack-operators(738a1346-88e9-4c4e-b7ce-1878736e2493): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.393782 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podUID="738a1346-88e9-4c4e-b7ce-1878736e2493" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.444723 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.484225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 
19:35:10.484414 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.484485 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:12.484468384 +0000 UTC m=+1012.096818708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.484521 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.484573 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:12.484559137 +0000 UTC m=+1012.096909451 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.484419 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.407755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" event={"ID":"64ff9a64-f79f-4a45-943d-36152964cfcd","Type":"ContainerStarted","Data":"724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82"} Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.410225 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" event={"ID":"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8","Type":"ContainerStarted","Data":"edc8f542c0b4375ecca22ccf3c0080c11524922334b8014465b5b2037358f125"} Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.411483 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" event={"ID":"738a1346-88e9-4c4e-b7ce-1878736e2493","Type":"ContainerStarted","Data":"c57de2de3f17acfe662d63880d81fd9cd3117e55a8eb320b063c6decf3af5b86"} Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.412373 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" podUID="29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8" Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.413371 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podUID="738a1346-88e9-4c4e-b7ce-1878736e2493" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.414735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" event={"ID":"792a7a0a-a11e-42ce-a99b-e24127e7bbe8","Type":"ContainerStarted","Data":"8b7ef117b102669f6e7ea2cb36f9491a15af28ec5bf3aaac6435515beb9d51bd"} Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.416495 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:b83e41b73854b29b5d6860a1d9e9ac7611640781\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podUID="792a7a0a-a11e-42ce-a99b-e24127e7bbe8" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.423838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" event={"ID":"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac","Type":"ContainerStarted","Data":"d80c15863f489995e056ec96bf2ea5143b89334f5bceadf92e59c7b37cbc8120"} Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.430884 4722 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podUID="2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac" Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.433171 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podUID="2c02c7e1-6f72-44be-a4fb-10ca1df420aa" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.798849 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.798929 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.034971 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 
19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.035182 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.035226 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.035213063 +0000 UTC m=+1015.647563377 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.349970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.350172 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.350263 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.350239063 +0000 UTC m=+1015.962589427 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.435597 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podUID="2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.435695 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podUID="738a1346-88e9-4c4e-b7ce-1878736e2493" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.438680 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" podUID="29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.446338 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:b83e41b73854b29b5d6860a1d9e9ac7611640781\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podUID="792a7a0a-a11e-42ce-a99b-e24127e7bbe8" Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.553700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.553843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.553947 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.554019 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.554061 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.554038136 +0000 UTC m=+1016.166388460 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.554091 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.554071057 +0000 UTC m=+1016.166421371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.111255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.111439 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.111791 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.111772487 +0000 UTC m=+1023.724122811 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.416306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.416492 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.416605 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.416579308 +0000 UTC m=+1024.028929662 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.618944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.619019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619192 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619244 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.619231245 +0000 UTC m=+1024.231581569 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619253 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619396 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.619375049 +0000 UTC m=+1024.231725433 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.001444 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.002095 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-67z9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-6t7g6_openstack-operators(57783601-5230-49ef-8ac2-0ddf78bd4b3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.003965 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" podUID="57783601-5230-49ef-8ac2-0ddf78bd4b3a" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.491968 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.492191 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvwq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-8cljg_openstack-operators(b37b04c7-5374-49d3-97c0-5b5b27c4a220): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.493712 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" podUID="b37b04c7-5374-49d3-97c0-5b5b27c4a220" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.520298 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" podUID="b37b04c7-5374-49d3-97c0-5b5b27c4a220" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.520635 4722 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" podUID="57783601-5230-49ef-8ac2-0ddf78bd4b3a" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.137539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.147883 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.257452 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.257702 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jf42n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-zft4s_openstack-operators(a6fb3554-24ea-4330-b2cb-1c91f105345d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.258976 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" podUID="a6fb3554-24ea-4330-b2cb-1c91f105345d" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.410391 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-f2sz9" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.418056 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.441363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.441542 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.441624 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:40.441603531 +0000 UTC m=+1040.053953855 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.527283 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" podUID="a6fb3554-24ea-4330-b2cb-1c91f105345d" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.644996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.645132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.650710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: 
\"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.668081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.775245 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4zvsr" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.783809 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.383920 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.384654 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrft7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-zdfxj_openstack-operators(f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.386295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" podUID="f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.539626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" podUID="f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.952144 4722 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.952525 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xwhl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-rnh9h_openstack-operators(c36983b4-b7f9-4834-85e9-a5c3cb83eb2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.953849 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" podUID="c36983b4-b7f9-4834-85e9-a5c3cb83eb2d" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.547922 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" podUID="c36983b4-b7f9-4834-85e9-a5c3cb83eb2d" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.748998 4722 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.749180 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xlwlx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-wqp5t_openstack-operators(64ff9a64-f79f-4a45-943d-36152964cfcd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.750437 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" podUID="64ff9a64-f79f-4a45-943d-36152964cfcd" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.554522 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" podUID="64ff9a64-f79f-4a45-943d-36152964cfcd" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.840208 4722 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.840736 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wd8gb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-7qkx4_openstack-operators(766eebc1-05fc-4ca0-8c75-276632a6597e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.843195 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" podUID="766eebc1-05fc-4ca0-8c75-276632a6597e" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.303696 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.303846 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sfdzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pjv7d_openstack-operators(65b17979-6c94-40e6-ac54-41a61a726e87): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.305122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" podUID="65b17979-6c94-40e6-ac54-41a61a726e87" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.560836 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" podUID="766eebc1-05fc-4ca0-8c75-276632a6597e" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.561317 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" podUID="65b17979-6c94-40e6-ac54-41a61a726e87" Feb 19 19:35:37 crc kubenswrapper[4722]: I0219 19:35:37.873173 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"] Feb 19 19:35:37 crc kubenswrapper[4722]: I0219 19:35:37.926352 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj"] Feb 19 19:35:37 crc kubenswrapper[4722]: W0219 19:35:37.947771 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f061e0_51af_4ab9_a8a7_26b2775651e1.slice/crio-04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4 WatchSource:0}: Error finding container 04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4: Status 404 returned error can't find the container with id 04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4 Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.648689 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" event={"ID":"edbe95e5-3a5d-4dec-9a94-509234857155","Type":"ContainerStarted","Data":"4487cf700dbcf431a1d68b2c6ea8a0076fe01544c2a6a6ae5f033cbb2d122abc"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.649580 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.662831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" event={"ID":"db329f91-74f2-4baa-ab5a-85ad999fc8ef","Type":"ContainerStarted","Data":"4472e0683e8b574a01c4e15d88578e8633e04850588202934771e6a2438d064b"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.663018 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.680594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" event={"ID":"738a1346-88e9-4c4e-b7ce-1878736e2493","Type":"ContainerStarted","Data":"c5a84a777980c9b91618fc57728e85cfb33c312ce3f62a262306c9e4e0345dd9"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.681330 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.694471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" event={"ID":"0af2e6ef-277d-4022-b42b-5639b589fef9","Type":"ContainerStarted","Data":"1ac8c6f3f1f9fb0a9846964c89b7023c08c4efde23f1eae391e56ffa766266e5"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.695712 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.701007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" event={"ID":"019f7edd-1d9b-4069-a2a1-36bbe6b0a567","Type":"ContainerStarted","Data":"ee31418e17898155a985a1968fecb7f9d9603894a860b0011d6cef67fe77d272"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.707256 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.720794 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" event={"ID":"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac","Type":"ContainerStarted","Data":"715825cd8635f30e5ca4361acce74a04e2b838bed261bf1eec5e2bd2a7bfe4a6"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.734246 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.734712 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" 
podStartSLOduration=10.685434639 podStartE2EDuration="30.734703318s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.245433561 +0000 UTC m=+1008.857783885" lastFinishedPulling="2026-02-19 19:35:29.29470224 +0000 UTC m=+1028.907052564" observedRunningTime="2026-02-19 19:35:38.734003366 +0000 UTC m=+1038.346353690" watchObservedRunningTime="2026-02-19 19:35:38.734703318 +0000 UTC m=+1038.347053642" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.746031 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" event={"ID":"2c02c7e1-6f72-44be-a4fb-10ca1df420aa","Type":"ContainerStarted","Data":"3d535cda1b0c8c51f72486f4353c197b016a10cbac2eedf9e290116c8c69c370"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.747015 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.787053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" event={"ID":"12f061e0-51af-4ab9-a8a7-26b2775651e1","Type":"ContainerStarted","Data":"06677fea9cfd054dd44ccb373c1289a9ec88d0e4622ac4f8ecc38a445986540a"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.787095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" event={"ID":"12f061e0-51af-4ab9-a8a7-26b2775651e1","Type":"ContainerStarted","Data":"04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.787656 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.837143 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" event={"ID":"b64009a1-83ef-4d66-bc6b-80ccfc6f7727","Type":"ContainerStarted","Data":"80f93f3f8813f9698f7662a057f164a7481334cf04f7871173fb2dc5ec4ad8a7"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.837206 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.847229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podStartSLOduration=3.691088167 podStartE2EDuration="30.847213716s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.390189725 +0000 UTC m=+1010.002540049" lastFinishedPulling="2026-02-19 19:35:37.546315264 +0000 UTC m=+1037.158665598" observedRunningTime="2026-02-19 19:35:38.789440624 +0000 UTC m=+1038.401790948" watchObservedRunningTime="2026-02-19 19:35:38.847213716 +0000 UTC m=+1038.459564040" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.847913 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" podStartSLOduration=10.857382340000001 podStartE2EDuration="30.847909037s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.292809808 +0000 UTC m=+1008.905160132" lastFinishedPulling="2026-02-19 19:35:29.283336505 +0000 UTC m=+1028.895686829" observedRunningTime="2026-02-19 19:35:38.842666003 +0000 UTC m=+1038.455016337" watchObservedRunningTime="2026-02-19 19:35:38.847909037 +0000 UTC m=+1038.460259361" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.853971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" event={"ID":"421f6539-4fcb-4949-ba29-34997fc98490","Type":"ContainerStarted","Data":"879a20f9d3ebd9c5640fe2cc700408edc14d4f73100e04e6e46aba7e09b03919"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.864627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" event={"ID":"792a7a0a-a11e-42ce-a99b-e24127e7bbe8","Type":"ContainerStarted","Data":"895f73332c259f575167ccd7c2f4c99c0df02eeed6dfa15598e840c070aee417"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.865224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.885435 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" podStartSLOduration=12.375692386 podStartE2EDuration="31.885417436s" podCreationTimestamp="2026-02-19 19:35:07 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.784944509 +0000 UTC m=+1009.397294833" lastFinishedPulling="2026-02-19 19:35:29.294669559 +0000 UTC m=+1028.907019883" observedRunningTime="2026-02-19 19:35:38.883556089 +0000 UTC m=+1038.495906413" watchObservedRunningTime="2026-02-19 19:35:38.885417436 +0000 UTC m=+1038.497767760" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.892370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" event={"ID":"baba09d1-2238-4ca1-98ee-f44938b68cd3","Type":"ContainerStarted","Data":"a1fb15869401aae08d17f4bf12a1969f0cc4c8c77d9e47b782efab7d76fc54bf"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.892876 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.931863 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podStartSLOduration=3.718078847 podStartE2EDuration="30.931849863s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.343850931 +0000 UTC m=+1009.956201245" lastFinishedPulling="2026-02-19 19:35:37.557621937 +0000 UTC m=+1037.169972261" observedRunningTime="2026-02-19 19:35:38.929393427 +0000 UTC m=+1038.541743751" watchObservedRunningTime="2026-02-19 19:35:38.931849863 +0000 UTC m=+1038.544200187" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.950385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" event={"ID":"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8","Type":"ContainerStarted","Data":"7358b3ea5bc9182b4c1e8bec442955fc6746af482a0c956a6c321ee2d9a44602"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.951067 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.955809 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" podStartSLOduration=5.607891316 podStartE2EDuration="30.95579352s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.758325799 +0000 UTC m=+1009.370676123" lastFinishedPulling="2026-02-19 19:35:35.106227963 +0000 UTC m=+1034.718578327" observedRunningTime="2026-02-19 19:35:38.955761939 +0000 UTC m=+1038.568112263" watchObservedRunningTime="2026-02-19 19:35:38.95579352 +0000 UTC m=+1038.568143844" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 
19:35:38.964836 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" event={"ID":"820eede6-6396-4466-bf00-5d3b39d982d6","Type":"ContainerStarted","Data":"26a3364052ca45767994b680e3619d00780c066d9c7e579ff0b47e2405a21d62"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.965551 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.045136 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" podStartSLOduration=31.045115074 podStartE2EDuration="31.045115074s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:35:39.043784472 +0000 UTC m=+1038.656134796" watchObservedRunningTime="2026-02-19 19:35:39.045115074 +0000 UTC m=+1038.657465398" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.082054 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" podStartSLOduration=11.153746498 podStartE2EDuration="31.082027345s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.366313989 +0000 UTC m=+1008.978664313" lastFinishedPulling="2026-02-19 19:35:29.294594836 +0000 UTC m=+1028.906945160" observedRunningTime="2026-02-19 19:35:39.07995449 +0000 UTC m=+1038.692304814" watchObservedRunningTime="2026-02-19 19:35:39.082027345 +0000 UTC m=+1038.694377669" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.138033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" 
podStartSLOduration=3.935187835 podStartE2EDuration="31.13801222s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.388641647 +0000 UTC m=+1010.000991971" lastFinishedPulling="2026-02-19 19:35:37.591466032 +0000 UTC m=+1037.203816356" observedRunningTime="2026-02-19 19:35:39.130189616 +0000 UTC m=+1038.742539940" watchObservedRunningTime="2026-02-19 19:35:39.13801222 +0000 UTC m=+1038.750362544" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.139203 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podStartSLOduration=3.869543 podStartE2EDuration="31.139196807s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.358276131 +0000 UTC m=+1009.970626455" lastFinishedPulling="2026-02-19 19:35:37.627929938 +0000 UTC m=+1037.240280262" observedRunningTime="2026-02-19 19:35:39.104543357 +0000 UTC m=+1038.716893681" watchObservedRunningTime="2026-02-19 19:35:39.139196807 +0000 UTC m=+1038.751547141" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.153856 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podStartSLOduration=5.048998655 podStartE2EDuration="31.153838143s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.201595797 +0000 UTC m=+1009.813946121" lastFinishedPulling="2026-02-19 19:35:36.306435245 +0000 UTC m=+1035.918785609" observedRunningTime="2026-02-19 19:35:39.153588795 +0000 UTC m=+1038.765939119" watchObservedRunningTime="2026-02-19 19:35:39.153838143 +0000 UTC m=+1038.766188467" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.184145 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" 
podStartSLOduration=12.6908621 podStartE2EDuration="32.184129237s" podCreationTimestamp="2026-02-19 19:35:07 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.801373851 +0000 UTC m=+1009.413724175" lastFinishedPulling="2026-02-19 19:35:29.294640968 +0000 UTC m=+1028.906991312" observedRunningTime="2026-02-19 19:35:39.179811183 +0000 UTC m=+1038.792161507" watchObservedRunningTime="2026-02-19 19:35:39.184129237 +0000 UTC m=+1038.796479561" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.976658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" event={"ID":"b37b04c7-5374-49d3-97c0-5b5b27c4a220","Type":"ContainerStarted","Data":"eace4536b524ad6edbe4682a42e6e2710ecc4fddd0b7f739d80ca9b434c3e2af"} Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.976966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.977826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" event={"ID":"a6fb3554-24ea-4330-b2cb-1c91f105345d","Type":"ContainerStarted","Data":"40c844911fae397675c6c86544f97ad54db860a1ebf4432da0f6b43be2fc9a61"} Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.978201 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.979776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" event={"ID":"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0","Type":"ContainerStarted","Data":"a23bb12790be74459bed2f278406f1f1090f41208d9ede70355aa9867af55d13"} Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.979957 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.982897 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" event={"ID":"57783601-5230-49ef-8ac2-0ddf78bd4b3a","Type":"ContainerStarted","Data":"71ca4f26dab6902a38fc13985df5823e5d5ebc4fa251606ba1aa0ad892a413cb"} Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.983267 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.991688 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" podStartSLOduration=3.415896578 podStartE2EDuration="31.99166938s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.127169466 +0000 UTC m=+1009.739519790" lastFinishedPulling="2026-02-19 19:35:38.702942268 +0000 UTC m=+1038.315292592" observedRunningTime="2026-02-19 19:35:39.991109562 +0000 UTC m=+1039.603459896" watchObservedRunningTime="2026-02-19 19:35:39.99166938 +0000 UTC m=+1039.604019704" Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.994000 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" podStartSLOduration=12.832664663 podStartE2EDuration="31.993991722s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.134293779 +0000 UTC m=+1009.746644093" lastFinishedPulling="2026-02-19 19:35:29.295620828 +0000 UTC m=+1028.907971152" observedRunningTime="2026-02-19 19:35:39.202568813 +0000 UTC m=+1038.814919137" watchObservedRunningTime="2026-02-19 19:35:39.993991722 +0000 UTC 
m=+1039.606342046" Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.016510 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" podStartSLOduration=3.655373883 podStartE2EDuration="32.016493444s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.184776192 +0000 UTC m=+1009.797126516" lastFinishedPulling="2026-02-19 19:35:38.545895753 +0000 UTC m=+1038.158246077" observedRunningTime="2026-02-19 19:35:40.01122497 +0000 UTC m=+1039.623575304" watchObservedRunningTime="2026-02-19 19:35:40.016493444 +0000 UTC m=+1039.628843768" Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.043847 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" podStartSLOduration=3.3591667689999998 podStartE2EDuration="32.043826515s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.185043231 +0000 UTC m=+1009.797393555" lastFinishedPulling="2026-02-19 19:35:38.869702977 +0000 UTC m=+1038.482053301" observedRunningTime="2026-02-19 19:35:40.036692804 +0000 UTC m=+1039.649043118" watchObservedRunningTime="2026-02-19 19:35:40.043826515 +0000 UTC m=+1039.656176849" Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.510373 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.520184 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.773185 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tmczn" Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.781735 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.108878 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" podStartSLOduration=4.486920473 podStartE2EDuration="33.108837784s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.922844667 +0000 UTC m=+1009.535194991" lastFinishedPulling="2026-02-19 19:35:38.544761978 +0000 UTC m=+1038.157112302" observedRunningTime="2026-02-19 19:35:40.06033548 +0000 UTC m=+1039.672685794" watchObservedRunningTime="2026-02-19 19:35:41.108837784 +0000 UTC m=+1040.721188108" Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.453630 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"] Feb 19 19:35:41 crc kubenswrapper[4722]: W0219 19:35:41.456769 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8870a7b1_f894_4429_9f52_d9063fe9c780.slice/crio-f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39 WatchSource:0}: Error finding container 
f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39: Status 404 returned error can't find the container with id f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39 Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.798548 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.798605 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.798652 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.799297 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.799360 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1" gracePeriod=600 Feb 19 19:35:42 crc 
kubenswrapper[4722]: I0219 19:35:42.008283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" event={"ID":"8870a7b1-f894-4429-9f52-d9063fe9c780","Type":"ContainerStarted","Data":"f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39"} Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.009777 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" event={"ID":"421f6539-4fcb-4949-ba29-34997fc98490","Type":"ContainerStarted","Data":"db79c9b9cbeac9eb51c0878d86de830bf4fb98b90efada8b7b4e6c5b99028afa"} Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.009891 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.011759 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1" exitCode=0 Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.011786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1"} Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.011807 4722 scope.go:117] "RemoveContainer" containerID="66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84" Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.026691 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" podStartSLOduration=30.99360685 podStartE2EDuration="34.026671104s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" 
firstStartedPulling="2026-02-19 19:35:37.920465327 +0000 UTC m=+1037.532815652" lastFinishedPulling="2026-02-19 19:35:40.953529582 +0000 UTC m=+1040.565879906" observedRunningTime="2026-02-19 19:35:42.024556808 +0000 UTC m=+1041.636907142" watchObservedRunningTime="2026-02-19 19:35:42.026671104 +0000 UTC m=+1041.639021428" Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.023146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5"} Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.026408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" event={"ID":"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d","Type":"ContainerStarted","Data":"d6201fb908c7d195c433333c870815690f49158d0e297e9249bae28ad2dcc2a9"} Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.026730 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.029430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" event={"ID":"64ff9a64-f79f-4a45-943d-36152964cfcd","Type":"ContainerStarted","Data":"7f19d3f0b25f57ab1510a777e2d1e7208095b102be91033f9de906954cdfc74c"} Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.029795 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.068061 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" 
podStartSLOduration=2.8579883170000002 podStartE2EDuration="35.068039175s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.328409849 +0000 UTC m=+1009.940760183" lastFinishedPulling="2026-02-19 19:35:42.538460717 +0000 UTC m=+1042.150811041" observedRunningTime="2026-02-19 19:35:43.063033629 +0000 UTC m=+1042.675383963" watchObservedRunningTime="2026-02-19 19:35:43.068039175 +0000 UTC m=+1042.680389499" Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.084645 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" podStartSLOduration=1.895200435 podStartE2EDuration="35.084626432s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.462382934 +0000 UTC m=+1009.074733258" lastFinishedPulling="2026-02-19 19:35:42.651808921 +0000 UTC m=+1042.264159255" observedRunningTime="2026-02-19 19:35:43.080796882 +0000 UTC m=+1042.693147216" watchObservedRunningTime="2026-02-19 19:35:43.084626432 +0000 UTC m=+1042.696976766" Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.079095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" event={"ID":"8870a7b1-f894-4429-9f52-d9063fe9c780","Type":"ContainerStarted","Data":"963486cba8459481d456b56600ea7f4d785af66779ff1abf7d31cb976a76d52e"} Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.079464 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.107226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" 
event={"ID":"65b17979-6c94-40e6-ac54-41a61a726e87","Type":"ContainerStarted","Data":"c20e6e7717a98848bf56f29ac034b7985f710e16b337af1496d189a7c5c984c3"} Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.153618 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" podStartSLOduration=2.71085826 podStartE2EDuration="36.153599184s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.129708465 +0000 UTC m=+1009.742058789" lastFinishedPulling="2026-02-19 19:35:43.572449369 +0000 UTC m=+1043.184799713" observedRunningTime="2026-02-19 19:35:44.149552168 +0000 UTC m=+1043.761902492" watchObservedRunningTime="2026-02-19 19:35:44.153599184 +0000 UTC m=+1043.765949508" Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.215144 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" podStartSLOduration=34.101983912 podStartE2EDuration="36.215126642s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:41.458645218 +0000 UTC m=+1041.070995542" lastFinishedPulling="2026-02-19 19:35:43.571787948 +0000 UTC m=+1043.184138272" observedRunningTime="2026-02-19 19:35:44.199565127 +0000 UTC m=+1043.811915451" watchObservedRunningTime="2026-02-19 19:35:44.215126642 +0000 UTC m=+1043.827476966" Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.789603 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:45 crc kubenswrapper[4722]: I0219 19:35:45.114905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" 
event={"ID":"766eebc1-05fc-4ca0-8c75-276632a6597e","Type":"ContainerStarted","Data":"488158f43ee85ef4bd2e7550885b354719b06bfeefba9b5443847a65185aeafd"} Feb 19 19:35:45 crc kubenswrapper[4722]: I0219 19:35:45.115307 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:45 crc kubenswrapper[4722]: I0219 19:35:45.134487 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" podStartSLOduration=2.677282515 podStartE2EDuration="37.13447121s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.143325621 +0000 UTC m=+1009.755675945" lastFinishedPulling="2026-02-19 19:35:44.600514316 +0000 UTC m=+1044.212864640" observedRunningTime="2026-02-19 19:35:45.129121053 +0000 UTC m=+1044.741471377" watchObservedRunningTime="2026-02-19 19:35:45.13447121 +0000 UTC m=+1044.746821524" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.395529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.412801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.441554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.522446 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.599098 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.632500 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.647330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.667669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.755332 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.802708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.817421 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.856852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.868727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.996280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.028851 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.079305 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.103732 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.141589 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:50 crc kubenswrapper[4722]: I0219 19:35:50.791075 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:54 crc kubenswrapper[4722]: I0219 19:35:54.432523 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:58 crc kubenswrapper[4722]: I0219 19:35:58.621379 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.906678 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.908305 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910600 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910777 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kvsd7" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910719 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.919743 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.969953 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.971068 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.973579 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.986665 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995271 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.096698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.096867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.097161 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.097216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 
19:36:20.097285 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.098133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.098470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.098964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.118879 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.120193 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.232660 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.284822 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.658701 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.754647 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:20 crc kubenswrapper[4722]: W0219 19:36:20.764611 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7421fc0e_3cfa_49af_a4c0_90807314bb61.slice/crio-9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163 WatchSource:0}: Error finding container 9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163: Status 404 returned error can't find the container with id 9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163 Feb 19 19:36:21 crc kubenswrapper[4722]: I0219 19:36:21.408508 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" event={"ID":"7421fc0e-3cfa-49af-a4c0-90807314bb61","Type":"ContainerStarted","Data":"9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163"} Feb 19 19:36:21 crc kubenswrapper[4722]: I0219 19:36:21.410375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" 
event={"ID":"af3a7297-2590-4a47-baa0-cd5b6029b6a4","Type":"ContainerStarted","Data":"bfeedc6c93dfc0ff6029b23856b61146b86096c4ced635aad0f319bef89af5a1"} Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.596017 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.627488 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.628848 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.635888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.635992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.636015 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.653373 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.737303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.737374 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.737392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.738310 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.738317 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.770646 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.945590 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.953694 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.970794 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.972337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.992284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.142450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.145457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" 
Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.146710 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.248427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.248473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.248578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.249365 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.250118 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.272581 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.352132 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.563524 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:36:23 crc kubenswrapper[4722]: W0219 19:36:23.571170 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d3acaf5_2a64_4dbd_8a74_f4e10cbd5465.slice/crio-f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00 WatchSource:0}: Error finding container f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00: Status 404 returned error can't find the container with id f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00 Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.794392 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.796078 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.798820 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799105 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799201 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cbm8q" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799317 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799434 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799587 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.800131 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.806754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.871914 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 
19:36:23.957382 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957564 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957595 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957669 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060040 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060191 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060214 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060231 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060250 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc 
kubenswrapper[4722]: I0219 19:36:24.061575 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.063879 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.064976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.065399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.065778 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.066569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.067325 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.067356 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5f6cee635ca5e2d348cf915d62a0dac8d2194b66bba55200fe901088eac3f7dd/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.067323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.080798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.084724 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"rabbitmq-server-0\" (UID: 
\"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.086955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.101479 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.124507 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.127520 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.132677 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.136791 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.132549 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.137657 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139510 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139544 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139552 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139760 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qdf2m" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.146778 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.271450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.271887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.271957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272029 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272166 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373347 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373514 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373547 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.376601 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.376888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.377442 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.377615 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.377870 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393307 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393506 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393884 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393910 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6408a1f41ebba08884844654cc07aafa4a02aa7486293e45dd19f823f7662d43/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.406821 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.474352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.514457 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" 
event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerStarted","Data":"967b9f32e64a62fdd1e64949dcea547f81e3dffcf01eb6300e42284f5721c31d"} Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.515430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerStarted","Data":"f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00"} Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.767669 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.811003 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.330855 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.422144 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.423561 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.435463 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436052 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436274 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436426 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436559 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-98pfc" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.537458 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerStarted","Data":"24279a6d2caf7ad4b1f181fa89124ed3ff752cfc1180df75df7a96c88d0345e2"} Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620188 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-default\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620327 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kolla-config\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620358 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82cjh\" (UniqueName: \"kubernetes.io/projected/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kube-api-access-82cjh\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-default\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725510 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82cjh\" (UniqueName: \"kubernetes.io/projected/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kube-api-access-82cjh\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.727978 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " 
pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.728023 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-default\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.729379 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kolla-config\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.731134 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.761944 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.762093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.762442 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.762486 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c8f62df97f865453863e26b12aa68d2572f80e4101124fe07995cb8bbe4bb98/globalmount\"" pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.766756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82cjh\" (UniqueName: \"kubernetes.io/projected/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kube-api-access-82cjh\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.815344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.062090 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.853706 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.855941 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.862373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.862797 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8hfxd" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.865567 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.865792 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.877534 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.925136 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.926414 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.935993 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.936462 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.936563 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.936730 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r9hpl" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-config-data\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053868 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kolla-config\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kube-api-access-f4gzc\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053968 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054031 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqwb\" (UniqueName: \"kubernetes.io/projected/a07f9633-74f5-48e5-8467-d649fc49a2ff-kube-api-access-twqwb\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155570 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-config-data\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kolla-config\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155798 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kube-api-access-f4gzc\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twqwb\" (UniqueName: \"kubernetes.io/projected/a07f9633-74f5-48e5-8467-d649fc49a2ff-kube-api-access-twqwb\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.157084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kolla-config\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.157339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.158140 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc 
kubenswrapper[4722]: I0219 19:36:27.158762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.158807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-config-data\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.160338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.164751 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.166538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.176629 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.181098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kube-api-access-f4gzc\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.181278 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqwb\" (UniqueName: \"kubernetes.io/projected/a07f9633-74f5-48e5-8467-d649fc49a2ff-kube-api-access-twqwb\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.181755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.184109 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.189352 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89de2f74d00e86341d558d2e9eae6b444d9d706847f448e590c53d8ab50a529c/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.228508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.248174 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.478689 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:28 crc kubenswrapper[4722]: I0219 19:36:28.987126 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:36:28 crc kubenswrapper[4722]: I0219 19:36:28.988340 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:36:28 crc kubenswrapper[4722]: I0219 19:36:28.991085 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cf9vh" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.000128 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.098434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"kube-state-metrics-0\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") " pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.200314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"kube-state-metrics-0\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") " pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.245935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"kube-state-metrics-0\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") " pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.364064 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.640445 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.641918 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-k49sx" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643960 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643979 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.644004 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.655490 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.808942 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809088 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809237 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcnq\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-kube-api-access-mxcnq\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 
crc kubenswrapper[4722]: I0219 19:36:29.809451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.910898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxcnq\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-kube-api-access-mxcnq\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911072 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc 
kubenswrapper[4722]: I0219 19:36:29.911136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.912025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.918350 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc 
kubenswrapper[4722]: I0219 19:36:29.923717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.924569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.924577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.928617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.938748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcnq\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-kube-api-access-mxcnq\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.960505 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.316974 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.319434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.322523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.322809 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.323014 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9sq" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325047 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325047 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325419 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325592 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325832 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.334924 4722 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519849 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 
19:36:30.520351 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: 
I0219 19:36:30.621656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621753 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc 
kubenswrapper[4722]: I0219 19:36:30.621836 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.622486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.622569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.622689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.625396 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.625664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630686 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630728 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/991d7114ade43d3df67520db88811056b16c48c5086e58d4724863cd9821be9f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.640501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.647791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.665176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.942619 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.123001 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6tmmr"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.132423 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.135048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fwvrs"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.136848 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.138638 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mm6tk" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.138903 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.139118 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.145584 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.182353 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fwvrs"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.213653 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-lib\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8300e35-4c72-4398-9058-0aa76005d576-scripts\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " 
pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214351 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-log-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-log\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214772 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214843 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-ovn-controller-tls-certs\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215064 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55j87\" (UniqueName: \"kubernetes.io/projected/c8300e35-4c72-4398-9058-0aa76005d576-kube-api-access-55j87\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " 
pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215174 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-combined-ca-bundle\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrdq\" (UniqueName: \"kubernetes.io/projected/293cde43-7bcf-4638-a080-badb26c81138-kube-api-access-8hrdq\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-etc-ovs\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.216125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-run\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.216193 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/293cde43-7bcf-4638-a080-badb26c81138-scripts\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 
19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.216916 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.268832 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.270292 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.274577 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.274809 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.274946 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.275117 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.276103 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.281896 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kqwrn" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318751 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55j87\" (UniqueName: \"kubernetes.io/projected/c8300e35-4c72-4398-9058-0aa76005d576-kube-api-access-55j87\") pod 
\"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-combined-ca-bundle\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrdq\" (UniqueName: \"kubernetes.io/projected/293cde43-7bcf-4638-a080-badb26c81138-kube-api-access-8hrdq\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-etc-ovs\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-run\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/293cde43-7bcf-4638-a080-badb26c81138-scripts\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc 
kubenswrapper[4722]: I0219 19:36:33.318980 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-lib\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8300e35-4c72-4398-9058-0aa76005d576-scripts\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-log-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-log\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-ovn-controller-tls-certs\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.321328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/293cde43-7bcf-4638-a080-badb26c81138-scripts\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.321950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.321993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8300e35-4c72-4398-9058-0aa76005d576-scripts\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322035 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-log\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " 
pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-log-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-run\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322192 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-lib\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322255 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-etc-ovs\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.328085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-ovn-controller-tls-certs\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.328095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-combined-ca-bundle\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.342885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55j87\" (UniqueName: \"kubernetes.io/projected/c8300e35-4c72-4398-9058-0aa76005d576-kube-api-access-55j87\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.346892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hrdq\" (UniqueName: \"kubernetes.io/projected/293cde43-7bcf-4638-a080-badb26c81138-kube-api-access-8hrdq\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420248 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420312 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr9hv\" (UniqueName: \"kubernetes.io/projected/13228713-9349-4241-b1f7-67f9a2c705fa-kube-api-access-nr9hv\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420641 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.459964 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.481727 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522162 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522220 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr9hv\" (UniqueName: \"kubernetes.io/projected/13228713-9349-4241-b1f7-67f9a2c705fa-kube-api-access-nr9hv\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522360 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc 
kubenswrapper[4722]: I0219 19:36:33.522993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.523860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.524007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.527110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.527130 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.527776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.532827 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.532876 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/98f583678ca9c1c12773c53e7f05cca773d292af92e8eebf2b9af8f5c6e51d46/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.546182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr9hv\" (UniqueName: \"kubernetes.io/projected/13228713-9349-4241-b1f7-67f9a2c705fa-kube-api-access-nr9hv\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.569371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.598607 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.737992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerStarted","Data":"3a2845abf856d9cafaeec46534beacb5f3f1990d5bed57b69cf295f8fe01e4f1"} Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.535773 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.537589 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.539909 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m656n" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.541523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.542257 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.547993 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.559384 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.714381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.714772 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-config\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.714929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdvv9\" (UniqueName: \"kubernetes.io/projected/05a27e5a-189e-4d17-9823-d95ef7906a7b-kube-api-access-qdvv9\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715454 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817836 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdvv9\" (UniqueName: \"kubernetes.io/projected/05a27e5a-189e-4d17-9823-d95ef7906a7b-kube-api-access-qdvv9\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817919 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.818021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-config\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.818039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.818939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.819646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-config\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.820056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.823986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.824028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.824253 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.824284 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ac2df717fdc68bee41330e6395ff3fb3a7398b56e12dd8c849b5929a74aef50f/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.825084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.842103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdvv9\" (UniqueName: \"kubernetes.io/projected/05a27e5a-189e-4d17-9823-d95ef7906a7b-kube-api-access-qdvv9\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.863098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:38 crc kubenswrapper[4722]: I0219 19:36:38.157440 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.153414 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.154423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.158551 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.158867 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.159183 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-h7lsd" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.159437 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.159878 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.173010 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " 
pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pbhw\" (UniqueName: \"kubernetes.io/projected/aba36975-65f4-4f71-a709-261d2b9255ea-kube-api-access-8pbhw\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256706 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 
19:36:40.357432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pbhw\" (UniqueName: 
\"kubernetes.io/projected/aba36975-65f4-4f71-a709-261d2b9255ea-kube-api-access-8pbhw\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.359230 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.360499 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.364751 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.365698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " 
pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.375576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pbhw\" (UniqueName: \"kubernetes.io/projected/aba36975-65f4-4f71-a709-261d2b9255ea-kube-api-access-8pbhw\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.378341 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.395030 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.395178 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.397482 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.399098 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.401532 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462650 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " 
pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462762 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8q97\" (UniqueName: \"kubernetes.io/projected/cad6276e-0607-49e0-8a90-a11e9b916991-kube-api-access-n8q97\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462834 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.473521 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.474061 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.474517 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.485100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.485103 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.493444 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.564859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npn74\" (UniqueName: \"kubernetes.io/projected/9babbc99-4133-47c1-85e5-95039351727b-kube-api-access-npn74\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 
crc kubenswrapper[4722]: I0219 19:36:40.564931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.564976 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565059 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8q97\" (UniqueName: 
\"kubernetes.io/projected/cad6276e-0607-49e0-8a90-a11e9b916991-kube-api-access-n8q97\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565198 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: 
\"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.566436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.567008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.574344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.574892 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.575863 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.580955 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.582111 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.588548 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.588751 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.588943 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.591481 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.592059 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.595634 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8q97\" (UniqueName: \"kubernetes.io/projected/cad6276e-0607-49e0-8a90-a11e9b916991-kube-api-access-n8q97\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.595963 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.598730 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.626119 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 
19:36:40.627321 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.633053 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-b5w4v" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.651488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"] Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667729 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6mx\" (UniqueName: \"kubernetes.io/projected/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-kube-api-access-vg6mx\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667847 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668312 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668336 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668363 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668633 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npn74\" (UniqueName: \"kubernetes.io/projected/9babbc99-4133-47c1-85e5-95039351727b-kube-api-access-npn74\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsgtc\" (UniqueName: 
\"kubernetes.io/projected/47cbe0b4-7d45-486b-9e9b-964db524e7ab-kube-api-access-wsgtc\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668866 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.670027 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.672592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" 
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.675182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.691015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npn74\" (UniqueName: \"kubernetes.io/projected/9babbc99-4133-47c1-85e5-95039351727b-kube-api-access-npn74\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.697237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.753572 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769426 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsgtc\" (UniqueName: \"kubernetes.io/projected/47cbe0b4-7d45-486b-9e9b-964db524e7ab-kube-api-access-wsgtc\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6mx\" (UniqueName: \"kubernetes.io/projected/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-kube-api-access-vg6mx\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769600 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769660 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769766 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769831 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.769947 4722 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.769990 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret podName:fc37f35d-ac2f-40a0-90e1-40c3b80b1782 nodeName:}" failed. No retries permitted until 2026-02-19 19:36:41.269972781 +0000 UTC m=+1100.882323105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret") pod "cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" (UID: "fc37f35d-ac2f-40a0-90e1-40c3b80b1782") : secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.770232 4722 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.770264 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret podName:47cbe0b4-7d45-486b-9e9b-964db524e7ab nodeName:}" failed. No retries permitted until 2026-02-19 19:36:41.27025418 +0000 UTC m=+1100.882604504 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret") pod "cloudkitty-lokistack-gateway-7f8685b49f-2j29g" (UID: "47cbe0b4-7d45-486b-9e9b-964db524e7ab") : secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.771284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.772494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.773088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.773770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-ca-bundle\") pod 
\"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.774094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.774382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.775004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.775945 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.776360 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.777515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.777545 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.778280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.778640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc 
kubenswrapper[4722]: I0219 19:36:40.782364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.798388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6mx\" (UniqueName: \"kubernetes.io/projected/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-kube-api-access-vg6mx\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.798854 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.804935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsgtc\" (UniqueName: \"kubernetes.io/projected/47cbe0b4-7d45-486b-9e9b-964db524e7ab-kube-api-access-wsgtc\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.276962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.277287 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.287108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.291164 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.330491 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.331474 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.337767 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.338069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.350027 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.425687 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.428945 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.431472 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.432349 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.442659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482175 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482308 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc 
kubenswrapper[4722]: I0219 19:36:41.482410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9znm\" (UniqueName: \"kubernetes.io/projected/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-kube-api-access-p9znm\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.528116 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.529466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.533647 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.534215 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.563232 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.571166 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.578908 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583904 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhls5\" (UniqueName: \"kubernetes.io/projected/53bc8f19-43b1-4297-a3db-986381793b6e-kube-api-access-hhls5\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" 
Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584090 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584175 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9znm\" (UniqueName: \"kubernetes.io/projected/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-kube-api-access-p9znm\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584704 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.585402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.585459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584717 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.587886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.588108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.614005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9znm\" (UniqueName: \"kubernetes.io/projected/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-kube-api-access-p9znm\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.617844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.633541 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") 
pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.646940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686937 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhls5\" (UniqueName: \"kubernetes.io/projected/53bc8f19-43b1-4297-a3db-986381793b6e-kube-api-access-hhls5\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686954 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686979 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: 
I0219 19:36:41.687072 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkrxm\" (UniqueName: \"kubernetes.io/projected/15869f30-52a4-4db0-aca8-53c5b319f7a1-kube-api-access-rkrxm\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687169 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.689189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.690704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.691824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.692059 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"53bc8f19-43b1-4297-a3db-986381793b6e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.695842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.704082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.704641 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhls5\" (UniqueName: \"kubernetes.io/projected/53bc8f19-43b1-4297-a3db-986381793b6e-kube-api-access-hhls5\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.710798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.746222 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.789933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkrxm\" (UniqueName: \"kubernetes.io/projected/15869f30-52a4-4db0-aca8-53c5b319f7a1-kube-api-access-rkrxm\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790087 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.791013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792438 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" 
(UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.795528 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.795525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.797897 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.812005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkrxm\" (UniqueName: \"kubernetes.io/projected/15869f30-52a4-4db0-aca8-53c5b319f7a1-kube-api-access-rkrxm\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc 
kubenswrapper[4722]: I0219 19:36:41.821205 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.891689 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.945818 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.065962 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.066664 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8pmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-nngml_openstack(af3a7297-2590-4a47-baa0-cd5b6029b6a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.067861 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" podUID="af3a7297-2590-4a47-baa0-cd5b6029b6a4" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.145599 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.145773 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcgth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-9t97w_openstack(7421fc0e-3cfa-49af-a4c0-90807314bb61): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.147345 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" podUID="7421fc0e-3cfa-49af-a4c0-90807314bb61" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.217097 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.217285 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wpz6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-jmsp2_openstack(17b6c8b5-9711-4601-a0fd-a1f528e97287): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.219551 4722 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.241657 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.242094 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kf2km,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-dt86l_openstack(4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.248249 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.907867 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.915461 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" Feb 19 19:36:48 crc kubenswrapper[4722]: I0219 19:36:48.964099 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.065807 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.190707 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.226143 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.784755 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.796566 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.909389 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.909580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerStarted","Data":"c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.917496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerStarted","Data":"f1e06939ba16d5e69507fa3a579142662a387babc305acdf6b56da52073daf71"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.919418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" event={"ID":"af3a7297-2590-4a47-baa0-cd5b6029b6a4","Type":"ContainerDied","Data":"bfeedc6c93dfc0ff6029b23856b61146b86096c4ced635aad0f319bef89af5a1"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.919472 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.920556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"059950bd-4e60-42e6-a9c6-4e4ab0b039aa","Type":"ContainerStarted","Data":"29233f754c6e0594fc2dce8d38065ec8f549ab8811984c4eb517f6e5dc70ca5f"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.922388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" event={"ID":"7421fc0e-3cfa-49af-a4c0-90807314bb61","Type":"ContainerDied","Data":"9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.922766 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.931695 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.945469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerStarted","Data":"d4965ef535ae6d75f99b0767c9cb18768aaccd6cfe6fd189dc3e54bc294d5b21"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.952970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"9eab85e47b8d4d46d4d09632c13f9eaf4c33976add1d0726acbd83490a88c6c1"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957217 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\" (UID: 
\"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957297 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"7421fc0e-3cfa-49af-a4c0-90807314bb61\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"7421fc0e-3cfa-49af-a4c0-90807314bb61\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957643 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"7421fc0e-3cfa-49af-a4c0-90807314bb61\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.958287 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7421fc0e-3cfa-49af-a4c0-90807314bb61" (UID: "7421fc0e-3cfa-49af-a4c0-90807314bb61"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.958326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config" (OuterVolumeSpecName: "config") pod "7421fc0e-3cfa-49af-a4c0-90807314bb61" (UID: "7421fc0e-3cfa-49af-a4c0-90807314bb61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.958326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config" (OuterVolumeSpecName: "config") pod "af3a7297-2590-4a47-baa0-cd5b6029b6a4" (UID: "af3a7297-2590-4a47-baa0-cd5b6029b6a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.959335 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.959370 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.959378 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.962678 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.963795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth" (OuterVolumeSpecName: "kube-api-access-hcgth") pod "7421fc0e-3cfa-49af-a4c0-90807314bb61" (UID: "7421fc0e-3cfa-49af-a4c0-90807314bb61"). InnerVolumeSpecName "kube-api-access-hcgth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.965795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh" (OuterVolumeSpecName: "kube-api-access-r8pmh") pod "af3a7297-2590-4a47-baa0-cd5b6029b6a4" (UID: "af3a7297-2590-4a47-baa0-cd5b6029b6a4"). InnerVolumeSpecName "kube-api-access-r8pmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: W0219 19:36:49.977457 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08df2e8_3f03_4e9c_91cf_2890026b9d76.slice/crio-3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5 WatchSource:0}: Error finding container 3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5: Status 404 returned error can't find the container with id 3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5 Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.981306 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: W0219 19:36:49.983840 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9babbc99_4133_47c1_85e5_95039351727b.slice/crio-8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed WatchSource:0}: Error finding container 8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed: Status 404 returned error can't find the container with id 
8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.061398 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.061438 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.273777 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.297944 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.312011 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.328709 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.364755 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:50 crc kubenswrapper[4722]: W0219 19:36:50.371629 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fc19f1_6f9f_4f35_a391_1f6743480bd3.slice/crio-49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff WatchSource:0}: Error finding container 49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff: Status 404 returned error can't find the container with id 49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff Feb 19 
19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.374204 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.396076 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.418128 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:50 crc kubenswrapper[4722]: W0219 19:36:50.427281 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47cbe0b4_7d45_486b_9e9b_964db524e7ab.slice/crio-887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce WatchSource:0}: Error finding container 887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce: Status 404 returned error can't find the container with id 887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.432138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.460088 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.460198 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.470165 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"] Feb 19 19:36:50 crc kubenswrapper[4722]: W0219 19:36:50.539668 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53bc8f19_43b1_4297_a3db_986381793b6e.slice/crio-9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3 WatchSource:0}: Error finding container 9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3: Status 404 returned error can't find the container with id 9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3 Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.541518 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fwvrs"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.964829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"53bc8f19-43b1-4297-a3db-986381793b6e","Type":"ContainerStarted","Data":"9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.967657 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"15869f30-52a4-4db0-aca8-53c5b319f7a1","Type":"ContainerStarted","Data":"aca690584825492c1c0bff04b230724017b6329ddea13ba665c0bc38724b6aac"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.968691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerStarted","Data":"e660effba251555c0c949b7e36ff3e2dc66e7b37a63e0229bb54eb33f555314b"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.970072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" event={"ID":"9babbc99-4133-47c1-85e5-95039351727b","Type":"ContainerStarted","Data":"8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.974082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-6tmmr" event={"ID":"293cde43-7bcf-4638-a080-badb26c81138","Type":"ContainerStarted","Data":"c1ba056f6d26a097d50633f3948e29ae5a7be3e9d18eae86e25eecf62e11ef0a"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.976298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" event={"ID":"47cbe0b4-7d45-486b-9e9b-964db524e7ab","Type":"ContainerStarted","Data":"887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.979289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerStarted","Data":"e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.982826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerStarted","Data":"5545dc8f3e2de249c7840626da07d4ee4ba5dd553856353617c9f89c2873d54d"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.984486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13228713-9349-4241-b1f7-67f9a2c705fa","Type":"ContainerStarted","Data":"291a68b55c5e55f85f7ff850e7faae5b89ae9c96777eeecedb246b9d8bc560b6"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.985738 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" event={"ID":"aba36975-65f4-4f71-a709-261d2b9255ea","Type":"ContainerStarted","Data":"5cc4476d61dd69eeb9b1a29772eb754a6f66b053a97c83b9236a853d67923df4"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.986833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" 
event={"ID":"fc37f35d-ac2f-40a0-90e1-40c3b80b1782","Type":"ContainerStarted","Data":"11f7874fd713795c3df3569dfb6f3b2b4fc68b64a645b032ff0ea869617a710d"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.994272 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5"} Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.026300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"a3fc19f1-6f9f-4f35-a391-1f6743480bd3","Type":"ContainerStarted","Data":"49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff"} Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.031996 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" event={"ID":"cad6276e-0607-49e0-8a90-a11e9b916991","Type":"ContainerStarted","Data":"23cfa51f66b510a9f59c6a0ad5b9d6cad884a31e01018fe52264bc7647f4333e"} Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.086007 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7421fc0e-3cfa-49af-a4c0-90807314bb61" path="/var/lib/kubelet/pods/7421fc0e-3cfa-49af-a4c0-90807314bb61/volumes" Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.086410 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3a7297-2590-4a47-baa0-cd5b6029b6a4" path="/var/lib/kubelet/pods/af3a7297-2590-4a47-baa0-cd5b6029b6a4/volumes" Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.320083 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 19:36:53 crc kubenswrapper[4722]: I0219 19:36:53.068211 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"05a27e5a-189e-4d17-9823-d95ef7906a7b","Type":"ContainerStarted","Data":"d68eda65528da0413baa83abe4fa0066a48a389941ffe61388b361110f4fd855"} Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.417823 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.418784 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h96h589h559h54ch5ddh89h5fdh5c4h575h578h696h589h5dbh594h9bhb5hf5h655h57dh54dh558h55ch54h67fh5bch66bhc5h577h687h598h76q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-55j87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Li
fecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-fwvrs_openstack(c8300e35-4c72-4398-9058-0aa76005d576): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.420860 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-fwvrs" podUID="c8300e35-4c72-4398-9058-0aa76005d576" Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.490908 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.491100 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twqwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(a07f9633-74f5-48e5-8467-d649fc49a2ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.492260 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="a07f9633-74f5-48e5-8467-d649fc49a2ff" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.104791 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.105016 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt 
--tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,Mount
Path:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vg6mx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cloudkitty-lokistack-gateway-7f8685b49f-qxjk2_openstack(fc37f35d-ac2f-40a0-90e1-40c3b80b1782): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.106440 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" podUID="fc37f35d-ac2f-40a0-90e1-40c3b80b1782" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.127292 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.127479 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s 
--tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadO
nly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsgtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-2j29g_openstack(47cbe0b4-7d45-486b-9e9b-964db524e7ab): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.128713 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" podUID="47cbe0b4-7d45-486b-9e9b-964db524e7ab" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.196838 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-fwvrs" podUID="c8300e35-4c72-4398-9058-0aa76005d576" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.201321 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.201524 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-ingester,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=ingester -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:wal,ReadOnly:false,Mo
untPath:/tmp/wal,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9znm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-ingester-0_openstack(a3fc19f1-6f9f-4f35-a391-1f6743480bd3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.202735 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.202756 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" podUID="fc37f35d-ac2f-40a0-90e1-40c3b80b1782" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.202729 4722 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" podUID="47cbe0b4-7d45-486b-9e9b-964db524e7ab" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.203645 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.203794 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=querier -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n8q97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liven
essProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-querier-58c84b5844-k6gcm_openstack(cad6276e-0607-49e0-8a90-a11e9b916991): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.204976 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" podUID="cad6276e-0607-49e0-8a90-a11e9b916991" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.208860 4722 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.209015 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init-config-reloader,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,Command:[/bin/prometheus-config-reloader],Args:[--watch-interval=0 --listen-address=:8081 --config-file=/etc/prometheus/config/prometheus.yaml.gz --config-envsubst-file=/etc/prometheus/config_out/prometheus.env.yaml --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0 --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1 --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:reloader-init,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SHARD,Value:0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/prometheus/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-out,ReadOnly:false,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nrjf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(a08df2e8-3f03-4e9c-91cf-2890026b9d76): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.210119 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.465197 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.465757 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.465882 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8c6z8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(14a7aae0-6a51-49ed-b4dd-9b274885d1da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.467088 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.206918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerStarted","Data":"c4e1aaf5987ae7613ffce2d46a9e81da3905d6934748687233a4bb3c5c36ee1f"} Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.208859 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"15869f30-52a4-4db0-aca8-53c5b319f7a1","Type":"ContainerStarted","Data":"7e4f75abf37ffede84097a88c6af1bf378a8112527438e23b556b46ba20eb725"} Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.209193 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.210449 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" event={"ID":"9babbc99-4133-47c1-85e5-95039351727b","Type":"ContainerStarted","Data":"03851b96b20e4de777f586e529e4d46af2053221200c0ef3829199d739e1473d"} Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.210693 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.213119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"059950bd-4e60-42e6-a9c6-4e4ab0b039aa","Type":"ContainerStarted","Data":"c04723d9f530225ce94b410cbfa59b3fac45d29eee478a25a703ed87bb459cf2"} Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.213173 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 19:37:07 crc kubenswrapper[4722]: E0219 19:37:07.218858 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" 
podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.256177 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=11.784457127 podStartE2EDuration="27.256138672s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.469184272 +0000 UTC m=+1110.081534596" lastFinishedPulling="2026-02-19 19:37:05.940865817 +0000 UTC m=+1125.553216141" observedRunningTime="2026-02-19 19:37:07.249133795 +0000 UTC m=+1126.861484159" watchObservedRunningTime="2026-02-19 19:37:07.256138672 +0000 UTC m=+1126.868488996" Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.270142 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" podStartSLOduration=11.540327502 podStartE2EDuration="27.270123577s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.00341815 +0000 UTC m=+1109.615768474" lastFinishedPulling="2026-02-19 19:37:05.733214225 +0000 UTC m=+1125.345564549" observedRunningTime="2026-02-19 19:37:07.264530813 +0000 UTC m=+1126.876881137" watchObservedRunningTime="2026-02-19 19:37:07.270123577 +0000 UTC m=+1126.882473901" Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.309748 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.987636365 podStartE2EDuration="41.30972427s" podCreationTimestamp="2026-02-19 19:36:26 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.302338705 +0000 UTC m=+1108.914689029" lastFinishedPulling="2026-02-19 19:37:05.62442661 +0000 UTC m=+1125.236776934" observedRunningTime="2026-02-19 19:37:07.301130932 +0000 UTC m=+1126.913481256" watchObservedRunningTime="2026-02-19 19:37:07.30972427 +0000 UTC m=+1126.922074594" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 
19:37:08.218709 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6tmmr" event={"ID":"293cde43-7bcf-4638-a080-badb26c81138","Type":"ContainerStarted","Data":"d1883e01a92455fb7d08ab0d5b74082dd3d67da4295aa6046b5334330d7445e4"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.219036 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6tmmr" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.220972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" event={"ID":"aba36975-65f4-4f71-a709-261d2b9255ea","Type":"ContainerStarted","Data":"1db9ab7e51ecff4bb8b508a812b573f7e7fa538e8898e96a6c05d7db6dd9f9c6"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.221120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.223014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13228713-9349-4241-b1f7-67f9a2c705fa","Type":"ContainerStarted","Data":"62db51a139bce149b57aa8c7df3c09f8b6a05984eb5694fd234c9b52b703e4bc"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.224511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"05a27e5a-189e-4d17-9823-d95ef7906a7b","Type":"ContainerStarted","Data":"f484cd67b93ed6d2d71d614d82319268658739c5cac1f6e4267d76484d315d23"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.226710 4722 generic.go:334] "Generic (PLEG): container finished" podID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97" exitCode=0 Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.226757 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" 
event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerDied","Data":"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.232428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerStarted","Data":"13563db377a2d356d6c5e051100eb3fdec737b3d97a75f97173241fd5519e50d"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.235692 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"53bc8f19-43b1-4297-a3db-986381793b6e","Type":"ContainerStarted","Data":"a6e3a7de83d225eadcc665957ce6c87d1619bbdc666aa3816ec19858e9188fdd"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.236457 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.247762 4722 generic.go:334] "Generic (PLEG): container finished" podID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" exitCode=0 Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.247861 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerDied","Data":"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.248911 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6tmmr" podStartSLOduration=19.771968806 podStartE2EDuration="35.248881713s" podCreationTimestamp="2026-02-19 19:36:33 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.372435112 +0000 UTC m=+1109.984785436" lastFinishedPulling="2026-02-19 19:37:05.849348019 +0000 UTC m=+1125.461698343" 
observedRunningTime="2026-02-19 19:37:08.238902762 +0000 UTC m=+1127.851253086" watchObservedRunningTime="2026-02-19 19:37:08.248881713 +0000 UTC m=+1127.861232047" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.283940 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=12.89198666 podStartE2EDuration="28.283922263s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.548737028 +0000 UTC m=+1110.161087352" lastFinishedPulling="2026-02-19 19:37:05.940672641 +0000 UTC m=+1125.553022955" observedRunningTime="2026-02-19 19:37:08.268202214 +0000 UTC m=+1127.880552558" watchObservedRunningTime="2026-02-19 19:37:08.283922263 +0000 UTC m=+1127.896272587" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.300980 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" podStartSLOduration=12.902858558 podStartE2EDuration="28.300956773s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.542615347 +0000 UTC m=+1110.154965671" lastFinishedPulling="2026-02-19 19:37:05.940713562 +0000 UTC m=+1125.553063886" observedRunningTime="2026-02-19 19:37:08.289412103 +0000 UTC m=+1127.901762477" watchObservedRunningTime="2026-02-19 19:37:08.300956773 +0000 UTC m=+1127.913307117" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.261851 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerStarted","Data":"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.263712 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.267633 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" event={"ID":"cad6276e-0607-49e0-8a90-a11e9b916991","Type":"ContainerStarted","Data":"ba81c6123c8a4f0cc4800e1eccdf8da2890142be6cd3b5f5778d095d7e56cccf"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.267864 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.269721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerStarted","Data":"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.270292 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.272226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"a3fc19f1-6f9f-4f35-a391-1f6743480bd3","Type":"ContainerStarted","Data":"ab71ad53934879b34f098a9a5647e7bcb7133425b126895ab692a938daacf85d"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.272986 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.295011 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podStartSLOduration=4.719654303 podStartE2EDuration="47.294989363s" podCreationTimestamp="2026-02-19 19:36:22 +0000 UTC" firstStartedPulling="2026-02-19 19:36:23.886493827 +0000 UTC m=+1083.498844151" lastFinishedPulling="2026-02-19 19:37:06.461828887 +0000 UTC m=+1126.074179211" observedRunningTime="2026-02-19 19:37:09.28237241 +0000 UTC m=+1128.894722744" 
watchObservedRunningTime="2026-02-19 19:37:09.294989363 +0000 UTC m=+1128.907339687" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.335433 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podStartSLOduration=4.451686527 podStartE2EDuration="47.33539446s" podCreationTimestamp="2026-02-19 19:36:22 +0000 UTC" firstStartedPulling="2026-02-19 19:36:23.574118649 +0000 UTC m=+1083.186468973" lastFinishedPulling="2026-02-19 19:37:06.457826582 +0000 UTC m=+1126.070176906" observedRunningTime="2026-02-19 19:37:09.296225561 +0000 UTC m=+1128.908575895" watchObservedRunningTime="2026-02-19 19:37:09.33539446 +0000 UTC m=+1128.947744784" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.346447 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=-9223372007.508347 podStartE2EDuration="29.346429443s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.374825336 +0000 UTC m=+1109.987175660" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:09.323731638 +0000 UTC m=+1128.936082002" watchObservedRunningTime="2026-02-19 19:37:09.346429443 +0000 UTC m=+1128.958779767" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.358016 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" podStartSLOduration=-9223372007.496778 podStartE2EDuration="29.357997613s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.294557829 +0000 UTC m=+1109.906908153" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:09.338380103 +0000 UTC m=+1128.950730447" watchObservedRunningTime="2026-02-19 19:37:09.357997613 +0000 UTC m=+1128.970347937" Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 
19:37:10.283769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.285759 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"873d37ee99fb511c8da26dd67c2f29770d664f57f955aa7d18b5dc0f234df076"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.287429 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13228713-9349-4241-b1f7-67f9a2c705fa","Type":"ContainerStarted","Data":"e05fde647fb1e3dd532c2cdd72bd74727b4395da4d5ee52337145b8d1d53bc36"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.288953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"05a27e5a-189e-4d17-9823-d95ef7906a7b","Type":"ContainerStarted","Data":"c7e351be424166d5d7820a88ce1149818ad63b6b821866688b2c8efa855c70f0"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.324881 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.847774373 podStartE2EDuration="38.324864068s" podCreationTimestamp="2026-02-19 19:36:32 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.345498204 +0000 UTC m=+1109.957848518" lastFinishedPulling="2026-02-19 19:37:09.822587879 +0000 UTC m=+1129.434938213" observedRunningTime="2026-02-19 19:37:10.323904108 +0000 UTC m=+1129.936254422" watchObservedRunningTime="2026-02-19 19:37:10.324864068 +0000 UTC m=+1129.937214392" Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.368337 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" 
podStartSLOduration=17.477743775 podStartE2EDuration="34.36831922s" podCreationTimestamp="2026-02-19 19:36:36 +0000 UTC" firstStartedPulling="2026-02-19 19:36:52.92284877 +0000 UTC m=+1112.535199094" lastFinishedPulling="2026-02-19 19:37:09.813424215 +0000 UTC m=+1129.425774539" observedRunningTime="2026-02-19 19:37:10.363445458 +0000 UTC m=+1129.975795802" watchObservedRunningTime="2026-02-19 19:37:10.36831922 +0000 UTC m=+1129.980669534" Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.158659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.232831 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.297446 4722 generic.go:334] "Generic (PLEG): container finished" podID="a07f9633-74f5-48e5-8467-d649fc49a2ff" containerID="c4e1aaf5987ae7613ffce2d46a9e81da3905d6934748687233a4bb3c5c36ee1f" exitCode=0 Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.297582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerDied","Data":"c4e1aaf5987ae7613ffce2d46a9e81da3905d6934748687233a4bb3c5c36ee1f"} Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.298407 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.250434 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.311191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerStarted","Data":"8f4e164cf5a8efd63d40164ec3935ff84db296d1d35f8d95c33fe3839e08d122"} Feb 19 
19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.359100 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371989.495707 podStartE2EDuration="47.359068934s" podCreationTimestamp="2026-02-19 19:36:25 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.210066374 +0000 UTC m=+1108.822416698" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:12.345275825 +0000 UTC m=+1131.957626169" watchObservedRunningTime="2026-02-19 19:37:12.359068934 +0000 UTC m=+1131.971419288" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.379675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.599628 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.631907 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.632686 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" containerID="cri-o://3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" gracePeriod=10 Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.654297 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.656025 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.662322 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.668683 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.675748 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758520 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758562 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.822500 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-282bs"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.830937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.834005 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860401 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860604 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860703 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.862085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.862328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.862972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.863086 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-282bs"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.918804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 
19:37:12.962606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9470e2b8-0f01-4735-8050-1bae363b3a02-config\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.962868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-combined-ca-bundle\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.962990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsn9\" (UniqueName: \"kubernetes.io/projected/9470e2b8-0f01-4735-8050-1bae363b3a02-kube-api-access-xbsn9\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.963110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovs-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.963212 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovn-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc 
kubenswrapper[4722]: I0219 19:37:12.963287 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.024704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.064601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9470e2b8-0f01-4735-8050-1bae363b3a02-config\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-combined-ca-bundle\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsn9\" (UniqueName: \"kubernetes.io/projected/9470e2b8-0f01-4735-8050-1bae363b3a02-kube-api-access-xbsn9\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065137 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovs-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovn-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065653 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovs-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.066200 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovn-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.066720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9470e2b8-0f01-4735-8050-1bae363b3a02-config\") pod 
\"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.072857 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-combined-ca-bundle\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.101424 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.112746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsn9\" (UniqueName: \"kubernetes.io/projected/9470e2b8-0f01-4735-8050-1bae363b3a02-kube-api-access-xbsn9\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.147267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.174639 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.265004 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.265249 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns" containerID="cri-o://a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434" gracePeriod=10 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.270923 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"17b6c8b5-9711-4601-a0fd-a1f528e97287\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.271010 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"17b6c8b5-9711-4601-a0fd-a1f528e97287\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.271172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"17b6c8b5-9711-4601-a0fd-a1f528e97287\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.280349 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x" (OuterVolumeSpecName: "kube-api-access-wpz6x") pod "17b6c8b5-9711-4601-a0fd-a1f528e97287" (UID: 
"17b6c8b5-9711-4601-a0fd-a1f528e97287"). InnerVolumeSpecName "kube-api-access-wpz6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.299432 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.299843 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.299855 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.299880 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="init" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.299887 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="init" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.300038 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.301075 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.305166 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.337891 4722 generic.go:334] "Generic (PLEG): container finished" podID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" exitCode=0 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.337969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerDied","Data":"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8"} Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.338017 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerDied","Data":"967b9f32e64a62fdd1e64949dcea547f81e3dffcf01eb6300e42284f5721c31d"} Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.338033 4722 scope.go:117] "RemoveContainer" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.338187 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.347496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17b6c8b5-9711-4601-a0fd-a1f528e97287" (UID: "17b6c8b5-9711-4601-a0fd-a1f528e97287"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.348402 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.350026 4722 generic.go:334] "Generic (PLEG): container finished" podID="53444e7f-4c1d-401b-9896-5ff9c4aab65a" containerID="13563db377a2d356d6c5e051100eb3fdec737b3d97a75f97173241fd5519e50d" exitCode=0 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.351314 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerDied","Data":"13563db377a2d356d6c5e051100eb3fdec737b3d97a75f97173241fd5519e50d"} Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.351345 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372612 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372756 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372772 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372815 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372864 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372875 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.420201 4722 scope.go:117] "RemoveContainer" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.420405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.423386 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config" (OuterVolumeSpecName: "config") pod "17b6c8b5-9711-4601-a0fd-a1f528e97287" (UID: "17b6c8b5-9711-4601-a0fd-a1f528e97287"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474330 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474504 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474620 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474831 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.475580 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.475667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.475762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.477515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.485753 4722 scope.go:117] "RemoveContainer" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.491464 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8\": container with ID starting with 3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8 not found: ID does not exist" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.491507 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8"} err="failed to get container status \"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8\": rpc error: code = NotFound desc = could not find container \"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8\": container with ID starting with 3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8 not found: ID does not exist" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.491537 4722 scope.go:117] "RemoveContainer" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.491823 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1\": container with ID starting with a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1 not found: ID does not exist" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.491854 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1"} err="failed to get container status \"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1\": rpc error: code = NotFound desc = could not find container \"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1\": container with ID starting with a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1 not found: ID does not exist" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.493051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.625787 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:13 crc kubenswrapper[4722]: W0219 19:37:13.627633 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec94b2bc_5f45_48da_aaba_8bd3e5c8e29b.slice/crio-ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9 WatchSource:0}: Error finding container ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9: Status 404 returned error can't find the container with id ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.650075 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.761939 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.783260 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.792759 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.796217 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.799465 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7rpl2" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.800048 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.801598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.801858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.807734 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.847980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-282bs"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886106 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-scripts\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-config\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnddz\" (UniqueName: \"kubernetes.io/projected/6f8e6f58-f989-41f2-b8cb-c798405cfa33-kube-api-access-fnddz\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.888258 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc 
kubenswrapper[4722]: I0219 19:37:13.888286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.988023 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.990662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.991177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.991213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.991280 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.992015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-scripts\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.992068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.992088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-config\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.992141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnddz\" (UniqueName: \"kubernetes.io/projected/6f8e6f58-f989-41f2-b8cb-c798405cfa33-kube-api-access-fnddz\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994665 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-config\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994990 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-scripts\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.010571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.021582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnddz\" (UniqueName: \"kubernetes.io/projected/6f8e6f58-f989-41f2-b8cb-c798405cfa33-kube-api-access-fnddz\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.093058 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") "
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.096353 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") "
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.096436 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") "
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.102189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km" (OuterVolumeSpecName: "kube-api-access-kf2km") pod "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" (UID: "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465"). InnerVolumeSpecName "kube-api-access-kf2km". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.137208 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config" (OuterVolumeSpecName: "config") pod "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" (UID: "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.158823 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" (UID: "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.198480 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.198514 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.198527 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.207931 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"]
Feb 19 19:37:14 crc kubenswrapper[4722]: W0219 19:37:14.214841 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfec288d_0744_48b4_8fcb_9ba349ebb6c4.slice/crio-b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5 WatchSource:0}: Error finding container b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5: Status 404 returned error can't find the container with id b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.285216 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.360063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerStarted","Data":"b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.365741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-282bs" event={"ID":"9470e2b8-0f01-4735-8050-1bae363b3a02","Type":"ContainerStarted","Data":"cc076b1f283aca77291d9912f5fc1dc5f832fcaa4d23e0d84625301b815b2d2a"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.365873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-282bs" event={"ID":"9470e2b8-0f01-4735-8050-1bae363b3a02","Type":"ContainerStarted","Data":"475eba7baafd2537d18af499574b953cb465d02e6ead97fc6eef6dffd274ae95"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370257 4722 generic.go:334] "Generic (PLEG): container finished" podID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434" exitCode=0
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370469 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerDied","Data":"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370980 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerDied","Data":"f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.371001 4722 scope.go:117] "RemoveContainer" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.384422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerStarted","Data":"c497eb715405c49a41ac9c6c19dd91ccc2639f412bb662f551950a2a6a4f9593"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.387248 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-282bs" podStartSLOduration=2.387231502 podStartE2EDuration="2.387231502s" podCreationTimestamp="2026-02-19 19:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:14.384082544 +0000 UTC m=+1133.996432878" watchObservedRunningTime="2026-02-19 19:37:14.387231502 +0000 UTC m=+1133.999581826"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.407353 4722 generic.go:334] "Generic (PLEG): container finished" podID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" exitCode=0
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.407587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerDied","Data":"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.407645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerStarted","Data":"ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9"}
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.426443 4722 scope.go:117] "RemoveContainer" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.428232 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=33.894086704 podStartE2EDuration="50.428214397s" podCreationTimestamp="2026-02-19 19:36:24 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.315224916 +0000 UTC m=+1108.927575230" lastFinishedPulling="2026-02-19 19:37:05.849352599 +0000 UTC m=+1125.461702923" observedRunningTime="2026-02-19 19:37:14.426799814 +0000 UTC m=+1134.039150138" watchObservedRunningTime="2026-02-19 19:37:14.428214397 +0000 UTC m=+1134.040564721"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.468443 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"]
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.487315 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"]
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.514455 4722 scope.go:117] "RemoveContainer" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"
Feb 19 19:37:14 crc kubenswrapper[4722]: E0219 19:37:14.514802 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434\": container with ID starting with a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434 not found: ID does not exist" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.514829 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"} err="failed to get container status \"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434\": rpc error: code = NotFound desc = could not find container \"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434\": container with ID starting with a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434 not found: ID does not exist"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.514848 4722 scope.go:117] "RemoveContainer" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97"
Feb 19 19:37:14 crc kubenswrapper[4722]: E0219 19:37:14.515014 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97\": container with ID starting with 99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97 not found: ID does not exist" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.515033 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97"} err="failed to get container status \"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97\": rpc error: code = NotFound desc = could not find container \"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97\": container with ID starting with 99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97 not found: ID does not exist"
Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.768916 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 19 19:37:14 crc kubenswrapper[4722]: W0219 19:37:14.771170 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f8e6f58_f989_41f2_b8cb_c798405cfa33.slice/crio-9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864 WatchSource:0}: Error finding container 9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864: Status 404 returned error can't find the container with id 9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864
Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.084357 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" path="/var/lib/kubelet/pods/17b6c8b5-9711-4601-a0fd-a1f528e97287/volumes"
Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.085656 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" path="/var/lib/kubelet/pods/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465/volumes"
Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.428690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6f8e6f58-f989-41f2-b8cb-c798405cfa33","Type":"ContainerStarted","Data":"9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864"}
Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.430492 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerID="62cc34e349902eca38fc94fdcd77006a8905ea0cb9cbb3392c7d1c40da4629fc" exitCode=0
Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.430552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerDied","Data":"62cc34e349902eca38fc94fdcd77006a8905ea0cb9cbb3392c7d1c40da4629fc"}
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.063968 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.064344 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.442580 4722 generic.go:334] "Generic (PLEG): container finished" podID="78e7f1b1-be76-4f05-bd63-ff87b440e173" containerID="873d37ee99fb511c8da26dd67c2f29770d664f57f955aa7d18b5dc0f234df076" exitCode=0
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.442825 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerDied","Data":"873d37ee99fb511c8da26dd67c2f29770d664f57f955aa7d18b5dc0f234df076"}
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.449394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerStarted","Data":"044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7"}
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.450817 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln"
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.453375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerStarted","Data":"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862"}
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.453803 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-794vh"
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.455186 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51" exitCode=0
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.455210 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51"}
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.519354 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" podStartSLOduration=3.519337954 podStartE2EDuration="3.519337954s" podCreationTimestamp="2026-02-19 19:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:16.510749257 +0000 UTC m=+1136.123099581" watchObservedRunningTime="2026-02-19 19:37:16.519337954 +0000 UTC m=+1136.131688268"
Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.531833 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" podStartSLOduration=4.531811993 podStartE2EDuration="4.531811993s" podCreationTimestamp="2026-02-19 19:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:16.529533852 +0000 UTC m=+1136.141884186" watchObservedRunningTime="2026-02-19 19:37:16.531811993 +0000 UTC m=+1136.144162317"
Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.465500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6f8e6f58-f989-41f2-b8cb-c798405cfa33","Type":"ContainerStarted","Data":"ccd7812cf29dbb3027cfbe13bca2d67873c62d4c527ff9b258bd53d7c9b1855c"}
Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.465827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6f8e6f58-f989-41f2-b8cb-c798405cfa33","Type":"ContainerStarted","Data":"3a29aa6cae244726865ab666544baaa48e5eeb85a103b154e2fa0dfe70455672"}
Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.481107 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.481405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.489792 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.848085668 podStartE2EDuration="4.48977317s" podCreationTimestamp="2026-02-19 19:37:13 +0000 UTC" firstStartedPulling="2026-02-19 19:37:14.773846902 +0000 UTC m=+1134.386197226" lastFinishedPulling="2026-02-19 19:37:16.415534414 +0000 UTC m=+1136.027884728" observedRunningTime="2026-02-19 19:37:17.483945699 +0000 UTC m=+1137.096296093" watchObservedRunningTime="2026-02-19 19:37:17.48977317 +0000 UTC m=+1137.102123504"
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.480128 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" event={"ID":"fc37f35d-ac2f-40a0-90e1-40c3b80b1782","Type":"ContainerStarted","Data":"86abf35ae7857af2662ed6f4bce3b72afb340a5c2b54a1e546605550ce412da0"}
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.480999 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.482281 4722 generic.go:334] "Generic (PLEG): container finished" podID="c8300e35-4c72-4398-9058-0aa76005d576" containerID="663d74fba80dd334e859cdb2cd8cfc2f7c0c52291c1501db2186e2522941c439" exitCode=0
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.482369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerDied","Data":"663d74fba80dd334e859cdb2cd8cfc2f7c0c52291c1501db2186e2522941c439"}
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.483007 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.502427 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.506059 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" podStartSLOduration=10.302538347 podStartE2EDuration="38.506039102s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.946300432 +0000 UTC m=+1109.558650756" lastFinishedPulling="2026-02-19 19:37:18.149801187 +0000 UTC m=+1137.762151511" observedRunningTime="2026-02-19 19:37:18.50564163 +0000 UTC m=+1138.117991974" watchObservedRunningTime="2026-02-19 19:37:18.506039102 +0000 UTC m=+1138.118389426"
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.777327 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.920395 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.315731 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"]
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.347647 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"]
Feb 19 19:37:19 crc kubenswrapper[4722]: E0219 19:37:19.349355 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="init"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.349396 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="init"
Feb 19 19:37:19 crc kubenswrapper[4722]: E0219 19:37:19.349432 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.349443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.349733 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.350888 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.384768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"]
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.498669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerStarted","Data":"3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2"}
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.498908 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.506497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerStarted","Data":"72b185aa4e6327015ed4749ec09a7270a569c75eac7f0279641fa9858918a81e"}
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.506920 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns" containerID="cri-o://6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" gracePeriod=10
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.515734 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.891697779 podStartE2EDuration="51.515716689s" podCreationTimestamp="2026-02-19 19:36:28 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.948352416 +0000 UTC m=+1109.560702740" lastFinishedPulling="2026-02-19 19:37:18.572371326 +0000 UTC m=+1138.184721650" observedRunningTime="2026-02-19 19:37:19.515525333 +0000 UTC m=+1139.127875657" watchObservedRunningTime="2026-02-19 19:37:19.515716689 +0000 UTC m=+1139.128067013"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526460 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526508 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526531 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.627959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628389 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.629708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.630010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.630099 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.631428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.657903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.670775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.785212 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.872228 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.363821 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.486966 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.488425 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="init"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.488451 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="init"
Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.488489 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.488498 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.489563 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.532805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.532944 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.535980 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.536387 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.536439 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.540542 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2g4z6"
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.546029 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"95da56850b2ea7e8730ab7324347b55ee33efb05afb659de11188f775ecfa216"}
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.547909 4722 generic.go:334] "Generic (PLEG): container finished" podID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" exitCode=0
Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.548894 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.548984 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerDied","Data":"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862"} Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549015 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerDied","Data":"ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9"} Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549034 4722 scope.go:117] "RemoveContainer" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549299 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.550723 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.550920 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.555720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn" (OuterVolumeSpecName: "kube-api-access-kzgmn") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "kube-api-access-kzgmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.607319 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config" (OuterVolumeSpecName: "config") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.615840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.622768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.628139 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.652791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-kube-api-access-v57xk\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.652839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-cache\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.652940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98dc74a5-9538-49e4-9dd0-eb2735f18d41-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-lock\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653223 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653240 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653251 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653261 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " 
pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755127 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dc74a5-9538-49e4-9dd0-eb2735f18d41-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-lock\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-lock\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.756121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-kube-api-access-v57xk\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.756182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-cache\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.756265 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.756454 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.756473 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.756520 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:21.256500557 +0000 UTC m=+1140.868850881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.757051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-cache\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.760281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dc74a5-9538-49e4-9dd0-eb2735f18d41-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.763480 4722 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.763516 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/603a810ec5db859f322e09f708b74d3e59133a63d24627063418a6e2b2532b88/globalmount\"" pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.778686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-kube-api-access-v57xk\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.796957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.885384 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.892879 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.931743 4722 scope.go:117] "RemoveContainer" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" Feb 19 19:37:20 crc kubenswrapper[4722]: 
W0219 19:37:20.936266 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12e3334_cc75_47af_870a_3d86164cb249.slice/crio-1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325 WatchSource:0}: Error finding container 1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325: Status 404 returned error can't find the container with id 1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325 Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.952699 4722 scope.go:117] "RemoveContainer" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.953137 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862\": container with ID starting with 6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862 not found: ID does not exist" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.953189 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862"} err="failed to get container status \"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862\": rpc error: code = NotFound desc = could not find container \"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862\": container with ID starting with 6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862 not found: ID does not exist" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.953223 4722 scope.go:117] "RemoveContainer" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.953570 4722 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94\": container with ID starting with a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94 not found: ID does not exist" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.953636 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94"} err="failed to get container status \"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94\": rpc error: code = NotFound desc = could not find container \"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94\": container with ID starting with a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94 not found: ID does not exist" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.101810 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" path="/var/lib/kubelet/pods/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b/volumes" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.270614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:21 crc kubenswrapper[4722]: E0219 19:37:21.271061 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:21 crc kubenswrapper[4722]: E0219 19:37:21.271077 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:21 crc 
kubenswrapper[4722]: E0219 19:37:21.271120 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:22.271106519 +0000 UTC m=+1141.883456833 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.559367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" event={"ID":"47cbe0b4-7d45-486b-9e9b-964db524e7ab","Type":"ContainerStarted","Data":"96dae1aa32b302d02a862017213278a24313e48e8ece75a386cfd1ad66863741"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.560358 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.564095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerStarted","Data":"09dc4a18c8e5633c815e20e2a05190540e7081aa9ea19714f54e4871e3e23e07"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.564259 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.570031 4722 generic.go:334] "Generic (PLEG): container finished" podID="b12e3334-cc75-47af-870a-3d86164cb249" containerID="58f8459d38255bc0ee2a3b1d7c9b5ab8e43bfd9e3de2e5dd8ef6021c2a7233ed" exitCode=0 Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.570070 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerDied","Data":"58f8459d38255bc0ee2a3b1d7c9b5ab8e43bfd9e3de2e5dd8ef6021c2a7233ed"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.570093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerStarted","Data":"1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.572794 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.582023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" podStartSLOduration=11.877379129 podStartE2EDuration="41.581972752s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.429975292 +0000 UTC m=+1110.042325616" lastFinishedPulling="2026-02-19 19:37:20.134568905 +0000 UTC m=+1139.746919239" observedRunningTime="2026-02-19 19:37:21.577877054 +0000 UTC m=+1141.190227388" watchObservedRunningTime="2026-02-19 19:37:21.581972752 +0000 UTC m=+1141.194323066" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.899839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.917360 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fwvrs" podStartSLOduration=22.132110712 podStartE2EDuration="48.917345877s" podCreationTimestamp="2026-02-19 19:36:33 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.696453564 +0000 UTC m=+1110.308803898" lastFinishedPulling="2026-02-19 19:37:17.481688699 +0000 UTC 
m=+1137.094039063" observedRunningTime="2026-02-19 19:37:21.669481135 +0000 UTC m=+1141.281831459" watchObservedRunningTime="2026-02-19 19:37:21.917345877 +0000 UTC m=+1141.529696201" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.298485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:22 crc kubenswrapper[4722]: E0219 19:37:22.298727 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:22 crc kubenswrapper[4722]: E0219 19:37:22.298889 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:22 crc kubenswrapper[4722]: E0219 19:37:22.298946 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:24.298931561 +0000 UTC m=+1143.911281885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.500120 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.501202 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.503544 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.516899 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.542554 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.544255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.548564 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.580097 4722 generic.go:334] "Generic (PLEG): container finished" podID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerID="c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2" exitCode=0 Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.580189 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerDied","Data":"c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2"} Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.582952 4722 generic.go:334] "Generic (PLEG): container finished" podID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerID="e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d" exitCode=0 Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.583006 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerDied","Data":"e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d"} Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.585251 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerStarted","Data":"acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3"} Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.585550 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.607298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.607381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.652033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podStartSLOduration=3.652015328 podStartE2EDuration="3.652015328s" podCreationTimestamp="2026-02-19 19:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:22.651826912 +0000 UTC m=+1142.264177246" 
watchObservedRunningTime="2026-02-19 19:37:22.652015328 +0000 UTC m=+1142.264365662" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.713315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.713418 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.713497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.714475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.715680 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod 
\"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.737831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.817237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.817368 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.818358 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.821740 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.841589 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.861083 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5m87g" Feb 19 19:37:23 crc kubenswrapper[4722]: I0219 19:37:23.593286 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:23 crc kubenswrapper[4722]: I0219 19:37:23.652521 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.347473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:24 crc kubenswrapper[4722]: E0219 19:37:24.347997 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:24 crc kubenswrapper[4722]: E0219 19:37:24.348030 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:24 crc kubenswrapper[4722]: E0219 19:37:24.348123 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" 
failed. No retries permitted until 2026-02-19 19:37:28.348067041 +0000 UTC m=+1147.960417365 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.368967 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.370523 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.372309 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.393467 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.445705 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-q5fhk"] Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.446787 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449109 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449580 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449842 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.480731 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-q5fhk"] Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " 
pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: 
\"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551634 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.552790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.581271 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"root-account-create-update-hlljb\" (UID: 
\"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.609661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"77e650597cadcbbcf8f49dc2aec43ffb45cc5d2a44416a9317b89a31c13904b9"} Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.636888 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=24.805993461 podStartE2EDuration="55.636868528s" podCreationTimestamp="2026-02-19 19:36:29 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.303291325 +0000 UTC m=+1108.915641649" lastFinishedPulling="2026-02-19 19:37:20.134166392 +0000 UTC m=+1139.746516716" observedRunningTime="2026-02-19 19:37:24.62857029 +0000 UTC m=+1144.240920624" watchObservedRunningTime="2026-02-19 19:37:24.636868528 +0000 UTC m=+1144.249218852" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.653213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654142 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654445 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod 
\"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.655728 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.655886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.657244 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.658141 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.658724 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 
19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.671266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.715313 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.768807 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2g4z6" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.776171 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.961602 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.965233 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.189253 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.318642 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:37:25 crc kubenswrapper[4722]: W0219 19:37:25.342434 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f940a76_c93f_46c5_af29_5b098a54adc8.slice/crio-35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe WatchSource:0}: Error finding container 
35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe: Status 404 returned error can't find the container with id 35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.431431 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-q5fhk"] Feb 19 19:37:25 crc kubenswrapper[4722]: W0219 19:37:25.444905 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc81edb08_7ac8_4cfc_abce_5895b8e7b59b.slice/crio-54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924 WatchSource:0}: Error finding container 54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924: Status 404 returned error can't find the container with id 54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924 Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.515795 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:25 crc kubenswrapper[4722]: W0219 19:37:25.518132 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd26936c_cebb_4507_92cb_45c7af5b7762.slice/crio-bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40 WatchSource:0}: Error finding container bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40: Status 404 returned error can't find the container with id bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40 Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.618407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hlljb" event={"ID":"fd26936c-cebb-4507-92cb-45c7af5b7762","Type":"ContainerStarted","Data":"bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.619951 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerStarted","Data":"d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.619988 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerStarted","Data":"d7437c095cf48c8adc0f2290a63f74e82e909327bc7acc88e8bdea32256fc6c2"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.622294 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerStarted","Data":"fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.622513 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.623660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerStarted","Data":"54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.625527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerStarted","Data":"37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.626529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.638749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.640899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerStarted","Data":"65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.640951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerStarted","Data":"35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe"} Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.652952 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1f02-account-create-update-cslgg" podStartSLOduration=3.6529367539999997 podStartE2EDuration="3.652936754s" podCreationTimestamp="2026-02-19 19:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:25.644547752 +0000 UTC m=+1145.256898086" watchObservedRunningTime="2026-02-19 19:37:25.652936754 +0000 UTC m=+1145.265287068" Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.676792 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.885012153 podStartE2EDuration="1m2.676778715s" podCreationTimestamp="2026-02-19 19:36:23 +0000 UTC" firstStartedPulling="2026-02-19 19:36:32.770795164 +0000 UTC m=+1092.383145488" lastFinishedPulling="2026-02-19 19:36:48.562561726 +0000 UTC m=+1108.174912050" observedRunningTime="2026-02-19 19:37:25.668504347 +0000 UTC m=+1145.280854691" watchObservedRunningTime="2026-02-19 19:37:25.676778715 
+0000 UTC m=+1145.289129039" Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.707814 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.333951554 podStartE2EDuration="1m3.70780039s" podCreationTimestamp="2026-02-19 19:36:22 +0000 UTC" firstStartedPulling="2026-02-19 19:36:24.866077622 +0000 UTC m=+1084.478427946" lastFinishedPulling="2026-02-19 19:36:48.239926458 +0000 UTC m=+1107.852276782" observedRunningTime="2026-02-19 19:37:25.704937311 +0000 UTC m=+1145.317287655" watchObservedRunningTime="2026-02-19 19:37:25.70780039 +0000 UTC m=+1145.320150714" Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.729320 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-5m87g" podStartSLOduration=3.72930167 podStartE2EDuration="3.72930167s" podCreationTimestamp="2026-02-19 19:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:25.720893128 +0000 UTC m=+1145.333243472" watchObservedRunningTime="2026-02-19 19:37:25.72930167 +0000 UTC m=+1145.341651994" Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.653360 4722 generic.go:334] "Generic (PLEG): container finished" podID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerID="65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4" exitCode=0 Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.653405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerDied","Data":"65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4"} Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.655854 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd26936c-cebb-4507-92cb-45c7af5b7762" 
containerID="ec6e9a5d8db1ce9bec823742a602001b48238109f03304859ab2fe4f5a1aeb10" exitCode=0 Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.655914 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hlljb" event={"ID":"fd26936c-cebb-4507-92cb-45c7af5b7762","Type":"ContainerDied","Data":"ec6e9a5d8db1ce9bec823742a602001b48238109f03304859ab2fe4f5a1aeb10"} Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.657850 4722 generic.go:334] "Generic (PLEG): container finished" podID="03387e77-59d8-4377-9a1c-dac948d84b59" containerID="d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538" exitCode=0 Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.657929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerDied","Data":"d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538"} Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.114063 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.212902 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4b7g9"] Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.213360 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" containerName="mariadb-account-create-update" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.213376 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" containerName="mariadb-account-create-update" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.213571 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" containerName="mariadb-account-create-update" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.214276 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.231503 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4b7g9"] Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.233555 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"fd26936c-cebb-4507-92cb-45c7af5b7762\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.233740 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"fd26936c-cebb-4507-92cb-45c7af5b7762\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 
19:37:28.234822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd26936c-cebb-4507-92cb-45c7af5b7762" (UID: "fd26936c-cebb-4507-92cb-45c7af5b7762"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.235968 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5m87g" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.242619 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf" (OuterVolumeSpecName: "kube-api-access-h2fkf") pod "fd26936c-cebb-4507-92cb-45c7af5b7762" (UID: "fd26936c-cebb-4507-92cb-45c7af5b7762"). InnerVolumeSpecName "kube-api-access-h2fkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.308818 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"] Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.309209 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerName="mariadb-database-create" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.309225 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerName="mariadb-database-create" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.309381 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerName="mariadb-database-create" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.310001 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.311758 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.328189 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"] Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"1f940a76-c93f-46c5-af29-5b098a54adc8\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335338 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"1f940a76-c93f-46c5-af29-5b098a54adc8\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc 
kubenswrapper[4722]: I0219 19:37:28.336041 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.336060 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.336288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f940a76-c93f-46c5-af29-5b098a54adc8" (UID: "1f940a76-c93f-46c5-af29-5b098a54adc8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.339412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr" (OuterVolumeSpecName: "kube-api-access-88jsr") pod "1f940a76-c93f-46c5-af29-5b098a54adc8" (UID: "1f940a76-c93f-46c5-af29-5b098a54adc8"). InnerVolumeSpecName "kube-api-access-88jsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.436997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: 
\"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437212 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437228 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.439101 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.439121 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.439197 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:36.43918319 +0000 UTC m=+1156.051533524 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.453366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.494548 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lqnqr"] Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.495708 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.510386 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"] Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.511589 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.512899 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.519061 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqnqr"] Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.528616 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"] Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538298 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.539048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.560588 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.585806 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.624897 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.641567 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.659564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.678173 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821"} Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.679977 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerDied","Data":"35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe"} Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.680001 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.680050 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5m87g" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.683116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hlljb" event={"ID":"fd26936c-cebb-4507-92cb-45c7af5b7762","Type":"ContainerDied","Data":"bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40"} Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.683182 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.683204 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.742794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.743385 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.744214 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " 
pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.787903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.812091 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.828600 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.368535 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.643929 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.673358 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.738056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerDied","Data":"d7437c095cf48c8adc0f2290a63f74e82e909327bc7acc88e8bdea32256fc6c2"} Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.738107 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7437c095cf48c8adc0f2290a63f74e82e909327bc7acc88e8bdea32256fc6c2" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.738268 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.741426 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.741617 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns" containerID="cri-o://044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7" gracePeriod=10 Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.761662 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod \"03387e77-59d8-4377-9a1c-dac948d84b59\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.761821 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"03387e77-59d8-4377-9a1c-dac948d84b59\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.762225 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03387e77-59d8-4377-9a1c-dac948d84b59" (UID: "03387e77-59d8-4377-9a1c-dac948d84b59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.762431 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.768438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2" (OuterVolumeSpecName: "kube-api-access-8j4s2") pod "03387e77-59d8-4377-9a1c-dac948d84b59" (UID: "03387e77-59d8-4377-9a1c-dac948d84b59"). InnerVolumeSpecName "kube-api-access-8j4s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.864399 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.483822 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.747609 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerID="044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7" exitCode=0 Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.747663 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerDied","Data":"044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7"} Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.760472 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.813837 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.870403 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.889431 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:31 crc kubenswrapper[4722]: I0219 19:37:31.087813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" path="/var/lib/kubelet/pods/fd26936c-cebb-4507-92cb-45c7af5b7762/volumes" Feb 19 19:37:31 crc kubenswrapper[4722]: I0219 19:37:31.757735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:37:31 crc kubenswrapper[4722]: I0219 19:37:31.951583 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.725888 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:37:32 crc kubenswrapper[4722]: E0219 19:37:32.726393 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" containerName="mariadb-account-create-update" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.726409 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" containerName="mariadb-account-create-update" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.726656 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" containerName="mariadb-account-create-update" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.727419 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.731120 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.731625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s8kl" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.771955 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.807743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerDied","Data":"b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5"} Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.808100 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.821389 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.833729 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.833792 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.833981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.834330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.936468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: 
\"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.936595 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.936655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937131 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937281 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.938050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.938090 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.946540 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227" (OuterVolumeSpecName: "kube-api-access-cw227") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "kube-api-access-cw227". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.947443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q"
Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.950066 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q"
Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.956222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q"
Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.970670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q"
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.032521 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.040369 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.040410 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.043789 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config" (OuterVolumeSpecName: "config") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.056893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.077089 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.089606 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4b7g9"]
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.146523 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.146564 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.146577 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.187455 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8fd9q"
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.233055 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"]
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.498784 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqnqr"]
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.514065 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"]
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.799555 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8fd9q"]
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.816047 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2439-account-create-update-lqmn5" event={"ID":"44afb335-8449-4492-a772-78889877810e","Type":"ContainerStarted","Data":"f892930a6a254a604ca1f62b197a46b6f4adf5d1a23ff675a6f4dceae3710829"}
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.817514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqnqr" event={"ID":"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358","Type":"ContainerStarted","Data":"1eb7fc48bd8b72e0b5c8cc94587a61101731402ad9ff8b02d1b1e5d0d69c49be"}
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.818973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerStarted","Data":"827ec543eff6496863bdbf6ae3908b628e0d5862787c9446d39fe5652d9dbfa4"}
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.820603 4722 generic.go:334] "Generic (PLEG): container finished" podID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerID="99c98b71002ac8948511844b6989a0da14ae66e034112843908355f3a72c44e7" exitCode=0
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.820674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4b7g9" event={"ID":"5bd3ad13-0324-4c1c-9b74-eb1401f06507","Type":"ContainerDied","Data":"99c98b71002ac8948511844b6989a0da14ae66e034112843908355f3a72c44e7"}
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.820699 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4b7g9" event={"ID":"5bd3ad13-0324-4c1c-9b74-eb1401f06507","Type":"ContainerStarted","Data":"5b71ad4ca4512f9223138f370e3a52a25097fcb37a438f94f9b1595c0fb1c496"}
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.821858 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c526-account-create-update-lmx4k" event={"ID":"93536b6f-8176-4737-a547-9face2995981","Type":"ContainerStarted","Data":"d43cf646287fe537785df6cf6532f9d6502d5c80eb8ccdd82b930f04b64f53a1"}
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.825180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d"}
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.825191 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln"
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.840465 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-q5fhk" podStartSLOduration=2.654601531 podStartE2EDuration="9.840442164s" podCreationTimestamp="2026-02-19 19:37:24 +0000 UTC" firstStartedPulling="2026-02-19 19:37:25.447688407 +0000 UTC m=+1145.060038731" lastFinishedPulling="2026-02-19 19:37:32.63352904 +0000 UTC m=+1152.245879364" observedRunningTime="2026-02-19 19:37:33.835074377 +0000 UTC m=+1153.447424701" watchObservedRunningTime="2026-02-19 19:37:33.840442164 +0000 UTC m=+1153.452792488"
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.889409 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.193164119 podStartE2EDuration="1m4.889391337s" podCreationTimestamp="2026-02-19 19:36:29 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.993832881 +0000 UTC m=+1109.606183205" lastFinishedPulling="2026-02-19 19:37:32.690060099 +0000 UTC m=+1152.302410423" observedRunningTime="2026-02-19 19:37:33.8856407 +0000 UTC m=+1153.497991034" watchObservedRunningTime="2026-02-19 19:37:33.889391337 +0000 UTC m=+1153.501741661"
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.915982 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"]
Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.922190 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"]
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.129743 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused"
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.361977 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.771894 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.847463 4722 generic.go:334] "Generic (PLEG): container finished" podID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerID="2e209875892b5272f7bb00341b24fa8e6b2be48cf1bccfa8acb4859e6aeca425" exitCode=0
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.847596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqnqr" event={"ID":"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358","Type":"ContainerDied","Data":"2e209875892b5272f7bb00341b24fa8e6b2be48cf1bccfa8acb4859e6aeca425"}
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.850464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerStarted","Data":"f0c405b64fff456aecf84e0cb3dfbb788e3a93a4e01a434dac62f40edc004d0e"}
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.853084 4722 generic.go:334] "Generic (PLEG): container finished" podID="93536b6f-8176-4737-a547-9face2995981" containerID="687c2f6cd621666c11c3a553d69b13af20c5311d98a27db188d1d7153219352e" exitCode=0
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.853193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c526-account-create-update-lmx4k" event={"ID":"93536b6f-8176-4737-a547-9face2995981","Type":"ContainerDied","Data":"687c2f6cd621666c11c3a553d69b13af20c5311d98a27db188d1d7153219352e"}
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.855576 4722 generic.go:334] "Generic (PLEG): container finished" podID="44afb335-8449-4492-a772-78889877810e" containerID="c2f010a6f9fb7a90aca42363ebf34cb5a6a44700de8e1351f8ac807b74981bd2" exitCode=0
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.855878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2439-account-create-update-lqmn5" event={"ID":"44afb335-8449-4492-a772-78889877810e","Type":"ContainerDied","Data":"c2f010a6f9fb7a90aca42363ebf34cb5a6a44700de8e1351f8ac807b74981bd2"}
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.085762 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" path="/var/lib/kubelet/pods/dfec288d-0744-48b4-8fcb-9ba349ebb6c4/volumes"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.231173 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.292985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") "
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.293213 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") "
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.293726 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bd3ad13-0324-4c1c-9b74-eb1401f06507" (UID: "5bd3ad13-0324-4c1c-9b74-eb1401f06507"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.298960 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq" (OuterVolumeSpecName: "kube-api-access-9glrq") pod "5bd3ad13-0324-4c1c-9b74-eb1401f06507" (UID: "5bd3ad13-0324-4c1c-9b74-eb1401f06507"). InnerVolumeSpecName "kube-api-access-9glrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.394741 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.394783 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868144 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4b7g9" event={"ID":"5bd3ad13-0324-4c1c-9b74-eb1401f06507","Type":"ContainerDied","Data":"5b71ad4ca4512f9223138f370e3a52a25097fcb37a438f94f9b1595c0fb1c496"}
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868694 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b71ad4ca4512f9223138f370e3a52a25097fcb37a438f94f9b1595c0fb1c496"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868926 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2wnjc"]
Feb 19 19:37:35 crc kubenswrapper[4722]: E0219 19:37:35.869477 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869508 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns"
Feb 19 19:37:35 crc kubenswrapper[4722]: E0219 19:37:35.869557 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerName="mariadb-database-create"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869571 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerName="mariadb-database-create"
Feb 19 19:37:35 crc kubenswrapper[4722]: E0219 19:37:35.869597 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="init"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869614 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="init"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869925 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerName="mariadb-database-create"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869970 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.871018 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.882395 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.886965 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2wnjc"]
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.905348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.905386 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.944049 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.007582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.007952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.010301 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.042850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.197423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.247802 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.323196 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"93536b6f-8176-4737-a547-9face2995981\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.323341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"93536b6f-8176-4737-a547-9face2995981\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.325435 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93536b6f-8176-4737-a547-9face2995981" (UID: "93536b6f-8176-4737-a547-9face2995981"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.330005 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln" (OuterVolumeSpecName: "kube-api-access-8l4ln") pod "93536b6f-8176-4737-a547-9face2995981" (UID: "93536b6f-8176-4737-a547-9face2995981"). InnerVolumeSpecName "kube-api-access-8l4ln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.426327 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.426357 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.508766 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.517417 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"44afb335-8449-4492-a772-78889877810e\" (UID: \"44afb335-8449-4492-a772-78889877810e\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529837 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"44afb335-8449-4492-a772-78889877810e\" (UID: \"44afb335-8449-4492-a772-78889877810e\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529907 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.530230 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0"
Feb 19 19:37:36 crc kubenswrapper[4722]: E0219 19:37:36.530463 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 19:37:36 crc kubenswrapper[4722]: E0219 19:37:36.530481 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 19:37:36 crc kubenswrapper[4722]: E0219 19:37:36.530527 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:52.530511337 +0000 UTC m=+1172.142861661 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.530815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" (UID: "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.531298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44afb335-8449-4492-a772-78889877810e" (UID: "44afb335-8449-4492-a772-78889877810e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.536339 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg" (OuterVolumeSpecName: "kube-api-access-28cwg") pod "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" (UID: "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358"). InnerVolumeSpecName "kube-api-access-28cwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.542379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j" (OuterVolumeSpecName: "kube-api-access-xt78j") pod "44afb335-8449-4492-a772-78889877810e" (UID: "44afb335-8449-4492-a772-78889877810e"). InnerVolumeSpecName "kube-api-access-xt78j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639237 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639281 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639292 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639299 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.880412 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.880665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2439-account-create-update-lqmn5" event={"ID":"44afb335-8449-4492-a772-78889877810e","Type":"ContainerDied","Data":"f892930a6a254a604ca1f62b197a46b6f4adf5d1a23ff675a6f4dceae3710829"}
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.880709 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f892930a6a254a604ca1f62b197a46b6f4adf5d1a23ff675a6f4dceae3710829"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.883661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqnqr" event={"ID":"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358","Type":"ContainerDied","Data":"1eb7fc48bd8b72e0b5c8cc94587a61101731402ad9ff8b02d1b1e5d0d69c49be"}
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.883697 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eb7fc48bd8b72e0b5c8cc94587a61101731402ad9ff8b02d1b1e5d0d69c49be"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.883769 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.893073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c526-account-create-update-lmx4k" event={"ID":"93536b6f-8176-4737-a547-9face2995981","Type":"ContainerDied","Data":"d43cf646287fe537785df6cf6532f9d6502d5c80eb8ccdd82b930f04b64f53a1"}
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.893111 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d43cf646287fe537785df6cf6532f9d6502d5c80eb8ccdd82b930f04b64f53a1"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.893198 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.903750 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2wnjc"]
Feb 19 19:37:36 crc kubenswrapper[4722]: W0219 19:37:36.912960 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fc542bf_bcc1_48b0_b0d9_a1c4e2702cc8.slice/crio-c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7 WatchSource:0}: Error finding container c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7: Status 404 returned error can't find the container with id c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7
Feb 19 19:37:37 crc kubenswrapper[4722]: I0219 19:37:37.903825 4722 generic.go:334] "Generic (PLEG): container finished" podID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerID="8aa3bea30fad3f939a077228a9ed1250c050038afc03ce315c796a876ab91692" exitCode=0
Feb 19 19:37:37 crc kubenswrapper[4722]: I0219 19:37:37.903887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2wnjc" event={"ID":"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8","Type":"ContainerDied","Data":"8aa3bea30fad3f939a077228a9ed1250c050038afc03ce315c796a876ab91692"}
Feb 19 19:37:37 crc kubenswrapper[4722]: I0219 19:37:37.904110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2wnjc" event={"ID":"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8","Type":"ContainerStarted","Data":"c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7"}
Feb 19 19:37:38 crc kubenswrapper[4722]: I0219 19:37:38.496323 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=<
Feb 19 19:37:38 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 19 19:37:38 crc kubenswrapper[4722]: >
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.321070 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.392437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") "
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.392764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") "
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.393203 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" (UID: "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.397405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx" (OuterVolumeSpecName: "kube-api-access-x2gtx") pod "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" (UID: "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8"). InnerVolumeSpecName "kube-api-access-x2gtx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.496280 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.496623 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.920695 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2wnjc" event={"ID":"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8","Type":"ContainerDied","Data":"c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7"}
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.920729 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7"
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.920778 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:40 crc kubenswrapper[4722]: I0219 19:37:40.931808 4722 generic.go:334] "Generic (PLEG): container finished" podID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerID="827ec543eff6496863bdbf6ae3908b628e0d5862787c9446d39fe5652d9dbfa4" exitCode=0
Feb 19 19:37:40 crc kubenswrapper[4722]: I0219 19:37:40.931887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerDied","Data":"827ec543eff6496863bdbf6ae3908b628e0d5862787c9446d39fe5652d9dbfa4"}
Feb 19 19:37:41 crc kubenswrapper[4722]: I0219 19:37:41.954040 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 19 19:37:43 crc kubenswrapper[4722]: I0219 19:37:43.500248 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=<
Feb 19 19:37:43 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 19 19:37:43 crc kubenswrapper[4722]: >
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.130354 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480016 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-67cbt"]
Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480417 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerName="mariadb-database-create"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480440 4722 
state_mem.go:107] "Deleted CPUSet assignment" podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerName="mariadb-database-create" Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480468 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93536b6f-8176-4737-a547-9face2995981" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480478 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="93536b6f-8176-4737-a547-9face2995981" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480488 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480494 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480514 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44afb335-8449-4492-a772-78889877810e" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480521 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44afb335-8449-4492-a772-78889877810e" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480667 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480684 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="93536b6f-8176-4737-a547-9face2995981" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480698 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerName="mariadb-database-create" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480708 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44afb335-8449-4492-a772-78889877810e" containerName="mariadb-account-create-update" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.482789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.491221 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-67cbt"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.580556 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.581745 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.591192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.591236 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.596879 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 19:37:44 crc 
kubenswrapper[4722]: I0219 19:37:44.598445 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.605294 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.607957 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.646218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692744 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692763 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.693929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.695818 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.697213 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.700879 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.712335 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.730970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.767489 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.768885 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.772879 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.773008 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.773186 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.773292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.781714 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.793870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.793918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.793944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod 
\"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794053 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794133 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794925 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.795034 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.809656 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.815023 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.815446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.871946 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.872988 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.887898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.901469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.901986 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.902161 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.902333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.902388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.903319 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.906204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.917722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.923886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.991043 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.994937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004238 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004332 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " 
pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.012843 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.013716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.013864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.019239 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.020420 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.023523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.028905 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.032693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.042680 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.096755 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.102017 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.103315 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105489 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105721 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.107240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.116532 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.129949 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.202541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208323 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.209215 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.226278 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310676 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310844 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 
crc kubenswrapper[4722]: I0219 19:37:45.311992 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.312651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.326530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.329022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.363166 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.375078 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.421815 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.942923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.946270 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.979941 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:46 crc kubenswrapper[4722]: I0219 19:37:46.942944 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.019962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerDied","Data":"54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924"} Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.020003 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.020021 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.048839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049020 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049092 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod 
\"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049253 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049344 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.050128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.050488 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.056586 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw" (OuterVolumeSpecName: "kube-api-access-hnhgw") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "kube-api-access-hnhgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.063285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.106322 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts" (OuterVolumeSpecName: "scripts") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.122903 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.145653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154305 4722 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154343 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154352 4722 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154362 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154372 4722 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154382 4722 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154400 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: E0219 19:37:47.558269 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc81edb08_7ac8_4cfc_abce_5895b8e7b59b.slice\": RecentStats: unable to find data in memory cache]" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.741292 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.753234 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.800668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.814113 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.030336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerStarted","Data":"bab4e0dcd47bed11b26a97a238fcb572193e857fe8e5670dfa59d566460783b1"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.031813 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-nqj2r" 
event={"ID":"2039a569-0bc4-49a4-9e82-08964729dc7b","Type":"ContainerStarted","Data":"ddb368090d84549e8613e6a8bf09662a248b3cdae6696eb51f4b8a9270abb3bd"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.033440 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-36cd-account-create-update-r5498" event={"ID":"a8d81d51-f4b7-4dec-9548-982de19b4742","Type":"ContainerStarted","Data":"edc9e09b1a3536dad44773c79d12736ed4976a1f27a0aa500ba378c707d81315"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.034937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerStarted","Data":"6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.038499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerStarted","Data":"42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.038546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerStarted","Data":"51c1d600bd28ddd30eb97781aa9e73764604da69509764e840b9fa1cd8a6698a"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.054662 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8fd9q" podStartSLOduration=2.7591183790000002 podStartE2EDuration="16.05463495s" podCreationTimestamp="2026-02-19 19:37:32 +0000 UTC" firstStartedPulling="2026-02-19 19:37:33.835691286 +0000 UTC m=+1153.448041610" lastFinishedPulling="2026-02-19 19:37:47.131207867 +0000 UTC m=+1166.743558181" observedRunningTime="2026-02-19 19:37:48.045777295 +0000 UTC m=+1167.658127619" watchObservedRunningTime="2026-02-19 
19:37:48.05463495 +0000 UTC m=+1167.666985274" Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.065604 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-j7hfg" podStartSLOduration=4.065583101 podStartE2EDuration="4.065583101s" podCreationTimestamp="2026-02-19 19:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:48.0620354 +0000 UTC m=+1167.674385724" watchObservedRunningTime="2026-02-19 19:37:48.065583101 +0000 UTC m=+1167.677933425" Feb 19 19:37:48 crc kubenswrapper[4722]: W0219 19:37:48.112082 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe445148_46c0_4e8c_844a_51a5ce323370.slice/crio-8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b WatchSource:0}: Error finding container 8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b: Status 404 returned error can't find the container with id 8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b Feb 19 19:37:48 crc kubenswrapper[4722]: W0219 19:37:48.112640 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc78d063e_7cd7_4b41_b148_1a7f9a3f9914.slice/crio-fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd WatchSource:0}: Error finding container fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd: Status 404 returned error can't find the container with id fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.117652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.128257 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-create-67cbt"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.140842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.153032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.168310 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.448037 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.448750 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" containerID="cri-o://572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f" gracePeriod=600 Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.449213 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" containerID="cri-o://efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d" gracePeriod=600 Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.449283 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" containerID="cri-o://c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821" gracePeriod=600 Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.523525 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" 
podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=< Feb 19 19:37:48 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 19:37:48 crc kubenswrapper[4722]: > Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.055287 4722 generic.go:334] "Generic (PLEG): container finished" podID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerID="a4f4b237835194ac1fcedd350c7532fc74f42e672c498f5c9cea05272f6986a0" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.055481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-nqj2r" event={"ID":"2039a569-0bc4-49a4-9e82-08964729dc7b","Type":"ContainerDied","Data":"a4f4b237835194ac1fcedd350c7532fc74f42e672c498f5c9cea05272f6986a0"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.059364 4722 generic.go:334] "Generic (PLEG): container finished" podID="a8d81d51-f4b7-4dec-9548-982de19b4742" containerID="bb275fbcbbe35a94955e26075778ab6128134f99af8b8d18b788e7b11aac61c6" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.059541 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-36cd-account-create-update-r5498" event={"ID":"a8d81d51-f4b7-4dec-9548-982de19b4742","Type":"ContainerDied","Data":"bb275fbcbbe35a94955e26075778ab6128134f99af8b8d18b788e7b11aac61c6"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.064674 4722 generic.go:334] "Generic (PLEG): container finished" podID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerID="42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.064775 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerDied","Data":"42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce"} Feb 19 
19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.067835 4722 generic.go:334] "Generic (PLEG): container finished" podID="25905c52-4074-40d4-826f-ef89353eeaa6" containerID="6c2e2442beaae76dbd599637b272c7eae6a58710a3bb17eed3e61507df9ea9e0" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.067915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-89e2-account-create-update-7656w" event={"ID":"25905c52-4074-40d4-826f-ef89353eeaa6","Type":"ContainerDied","Data":"6c2e2442beaae76dbd599637b272c7eae6a58710a3bb17eed3e61507df9ea9e0"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.067938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-89e2-account-create-update-7656w" event={"ID":"25905c52-4074-40d4-826f-ef89353eeaa6","Type":"ContainerStarted","Data":"1a6bf93e45faaf4db27d8e2c20d1f7fc553e5da9aa844b25ea2e3a9760a7bce6"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.078259 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.078293 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.078308 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.100506 4722 generic.go:334] "Generic (PLEG): container finished" podID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerID="6d49fd861306d1a47364956e09d02157a9618a565198ef080d63694bf02fdc31" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105338 4722 
generic.go:334] "Generic (PLEG): container finished" podID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerID="8c73c8e1b7d4896f7ab7a5272b3c22c63e7d90ad3033ca9be834b667cd882b7f" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105768 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105884 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105895 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0f99-account-create-update-fflhf" event={"ID":"217ea569-e058-4f21-bbb7-d2f2648375eb","Type":"ContainerDied","Data":"6d49fd861306d1a47364956e09d02157a9618a565198ef080d63694bf02fdc31"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0f99-account-create-update-fflhf" event={"ID":"217ea569-e058-4f21-bbb7-d2f2648375eb","Type":"ContainerStarted","Data":"7430b538256ae44d6944b8b2a907a2e6c7bd0b62c3823fced29b96d85a5d5b4f"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7kcsc" 
event={"ID":"d5778eec-eb7e-4137-85bd-761ac78b9fd7","Type":"ContainerDied","Data":"8c73c8e1b7d4896f7ab7a5272b3c22c63e7d90ad3033ca9be834b667cd882b7f"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7kcsc" event={"ID":"d5778eec-eb7e-4137-85bd-761ac78b9fd7","Type":"ContainerStarted","Data":"414b6e4e860136d1c415be1caf745a8eda79544b985daf7b493e0eacf866bbda"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.112646 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe445148-46c0-4e8c-844a-51a5ce323370" containerID="bf1fddeb0ef2831ba2e02a1aa709a530121f690fbf768791dd2408b9c18e9009" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.112819 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-67cbt" event={"ID":"fe445148-46c0-4e8c-844a-51a5ce323370","Type":"ContainerDied","Data":"bf1fddeb0ef2831ba2e02a1aa709a530121f690fbf768791dd2408b9c18e9009"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.112856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-67cbt" event={"ID":"fe445148-46c0-4e8c-844a-51a5ce323370","Type":"ContainerStarted","Data":"8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.115121 4722 generic.go:334] "Generic (PLEG): container finished" podID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerID="24deafb2187b5509b9a503b5cde68eab414e437eef2f36f8141214811c39e398" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.115201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eefc-account-create-update-h8n6c" event={"ID":"c78d063e-7cd7-4b41-b148-1a7f9a3f9914","Type":"ContainerDied","Data":"24deafb2187b5509b9a503b5cde68eab414e437eef2f36f8141214811c39e398"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.115228 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-eefc-account-create-update-h8n6c" event={"ID":"c78d063e-7cd7-4b41-b148-1a7f9a3f9914","Type":"ContainerStarted","Data":"fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.546549 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704340 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704399 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704448 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704585 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704637 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704663 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704714 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704780 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.705036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.705473 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.705693 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.707916 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.713297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out" (OuterVolumeSpecName: "config-out") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.714073 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.718844 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config" (OuterVolumeSpecName: "config") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.719363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.731401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5" (OuterVolumeSpecName: "kube-api-access-nrjf5") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "kube-api-access-nrjf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.738953 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.779367 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config" (OuterVolumeSpecName: "web-config") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810588 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") on node \"crc\" " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810630 4722 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810645 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810666 4722 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810680 4722 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810690 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810703 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrjf5\" (UniqueName: 
\"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810715 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810728 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810737 4722 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.832431 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.832688 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3") on node "crc" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.913094 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.129498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5"} Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.129597 4722 scope.go:117] "RemoveContainer" containerID="efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.129791 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.198209 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.216123 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.251244 4722 scope.go:117] "RemoveContainer" containerID="c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.251382 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252686 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252703 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252716 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252724 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252736 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerName="swift-ring-rebalance" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerName="swift-ring-rebalance" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252757 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252763 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252772 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="init-config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252777 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="init-config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252957 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerName="swift-ring-rebalance" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252991 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.253007 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.253026 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.255160 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261355 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261558 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261661 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261918 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.262046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.262163 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.270112 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.284657 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9sq" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.284970 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.289629 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.345987 4722 scope.go:117] "RemoveContainer" 
containerID="572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.347893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.347934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.347980 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m6hm\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-kube-api-access-9m6hm\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348114 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: 
\"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348251 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348271 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.386449 4722 scope.go:117] "RemoveContainer" containerID="a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 
19:37:50.450341 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450430 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m6hm\" (UniqueName: 
\"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-kube-api-access-9m6hm\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.455810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.457136 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.458364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.459048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.462616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.464388 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.464430 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/991d7114ade43d3df67520db88811056b16c48c5086e58d4724863cd9821be9f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.465508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.467927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.470794 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.473588 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.473935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.494065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m6hm\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-kube-api-access-9m6hm\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.575834 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.587722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.649794 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.763749 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.763998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.765824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" (UID: "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.770308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b" (OuterVolumeSpecName: "kube-api-access-tks5b") pod "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" (UID: "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92"). InnerVolumeSpecName "kube-api-access-tks5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.866144 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.866542 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.091880 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" path="/var/lib/kubelet/pods/a08df2e8-3f03-4e9c-91cf-2890026b9d76/volumes" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.146075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerDied","Data":"51c1d600bd28ddd30eb97781aa9e73764604da69509764e840b9fa1cd8a6698a"} Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.146107 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c1d600bd28ddd30eb97781aa9e73764604da69509764e840b9fa1cd8a6698a" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.146107 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.951625 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:52.598969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:52.605359 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:52.658254 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.324644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.510994 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=< Feb 19 19:37:53 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 19:37:53 crc kubenswrapper[4722]: > Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.535509 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.536942 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.784080 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:53 crc kubenswrapper[4722]: E0219 19:37:53.784542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerName="mariadb-database-create" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.784557 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerName="mariadb-database-create" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.784760 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerName="mariadb-database-create" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.785449 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.787927 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.796360 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod 
\"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929853 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929887 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930657 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.932024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.949308 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:54 crc kubenswrapper[4722]: I0219 19:37:54.109783 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:55 crc kubenswrapper[4722]: I0219 19:37:55.992134 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.019398 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.046483 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.064497 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.075451 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"2039a569-0bc4-49a4-9e82-08964729dc7b\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.075896 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.076059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.076804 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2039a569-0bc4-49a4-9e82-08964729dc7b" (UID: "2039a569-0bc4-49a4-9e82-08964729dc7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.077145 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.077229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"2039a569-0bc4-49a4-9e82-08964729dc7b\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.077853 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.079100 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5778eec-eb7e-4137-85bd-761ac78b9fd7" (UID: "d5778eec-eb7e-4137-85bd-761ac78b9fd7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.090968 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk" (OuterVolumeSpecName: "kube-api-access-tctdk") pod "2039a569-0bc4-49a4-9e82-08964729dc7b" (UID: "2039a569-0bc4-49a4-9e82-08964729dc7b"). InnerVolumeSpecName "kube-api-access-tctdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.092321 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.102999 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf" (OuterVolumeSpecName: "kube-api-access-r6ndf") pod "d5778eec-eb7e-4137-85bd-761ac78b9fd7" (UID: "d5778eec-eb7e-4137-85bd-761ac78b9fd7"). InnerVolumeSpecName "kube-api-access-r6ndf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.103181 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.134773 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178661 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"a8d81d51-f4b7-4dec-9548-982de19b4742\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178832 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178876 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod \"25905c52-4074-40d4-826f-ef89353eeaa6\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod 
\"fe445148-46c0-4e8c-844a-51a5ce323370\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178944 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"25905c52-4074-40d4-826f-ef89353eeaa6\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178969 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"a8d81d51-f4b7-4dec-9548-982de19b4742\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179006 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"217ea569-e058-4f21-bbb7-d2f2648375eb\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"fe445148-46c0-4e8c-844a-51a5ce323370\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179058 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"217ea569-e058-4f21-bbb7-d2f2648375eb\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179579 4722 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179602 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179612 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.181242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c78d063e-7cd7-4b41-b148-1a7f9a3f9914" (UID: "c78d063e-7cd7-4b41-b148-1a7f9a3f9914"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.181568 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8d81d51-f4b7-4dec-9548-982de19b4742" (UID: "a8d81d51-f4b7-4dec-9548-982de19b4742"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.182194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25905c52-4074-40d4-826f-ef89353eeaa6" (UID: "25905c52-4074-40d4-826f-ef89353eeaa6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.183165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe445148-46c0-4e8c-844a-51a5ce323370" (UID: "fe445148-46c0-4e8c-844a-51a5ce323370"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.184646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "217ea569-e058-4f21-bbb7-d2f2648375eb" (UID: "217ea569-e058-4f21-bbb7-d2f2648375eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.185094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg" (OuterVolumeSpecName: "kube-api-access-2dqjg") pod "c78d063e-7cd7-4b41-b148-1a7f9a3f9914" (UID: "c78d063e-7cd7-4b41-b148-1a7f9a3f9914"). InnerVolumeSpecName "kube-api-access-2dqjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.186903 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc" (OuterVolumeSpecName: "kube-api-access-k6rpc") pod "217ea569-e058-4f21-bbb7-d2f2648375eb" (UID: "217ea569-e058-4f21-bbb7-d2f2648375eb"). InnerVolumeSpecName "kube-api-access-k6rpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.189645 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6" (OuterVolumeSpecName: "kube-api-access-xb5b6") pod "25905c52-4074-40d4-826f-ef89353eeaa6" (UID: "25905c52-4074-40d4-826f-ef89353eeaa6"). InnerVolumeSpecName "kube-api-access-xb5b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.189741 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74" (OuterVolumeSpecName: "kube-api-access-jcg74") pod "fe445148-46c0-4e8c-844a-51a5ce323370" (UID: "fe445148-46c0-4e8c-844a-51a5ce323370"). InnerVolumeSpecName "kube-api-access-jcg74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.196309 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2" (OuterVolumeSpecName: "kube-api-access-vnvz2") pod "a8d81d51-f4b7-4dec-9548-982de19b4742" (UID: "a8d81d51-f4b7-4dec-9548-982de19b4742"). InnerVolumeSpecName "kube-api-access-vnvz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.207596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-67cbt" event={"ID":"fe445148-46c0-4e8c-844a-51a5ce323370","Type":"ContainerDied","Data":"8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.207638 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.207687 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.214829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6tmmr-config-x86ml" event={"ID":"ff6d44c7-0792-4927-8214-a62a52211e92","Type":"ContainerStarted","Data":"d80a4243d9916b25ce2acd62cd615d65a0ec3c0a009774f96f8f4e7954f803ae"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.216717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-nqj2r" event={"ID":"2039a569-0bc4-49a4-9e82-08964729dc7b","Type":"ContainerDied","Data":"ddb368090d84549e8613e6a8bf09662a248b3cdae6696eb51f4b8a9270abb3bd"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.216757 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddb368090d84549e8613e6a8bf09662a248b3cdae6696eb51f4b8a9270abb3bd" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.216836 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.219546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-36cd-account-create-update-r5498" event={"ID":"a8d81d51-f4b7-4dec-9548-982de19b4742","Type":"ContainerDied","Data":"edc9e09b1a3536dad44773c79d12736ed4976a1f27a0aa500ba378c707d81315"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.219567 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.219578 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc9e09b1a3536dad44773c79d12736ed4976a1f27a0aa500ba378c707d81315" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.221558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"6b0b8c11d3f84c649ee2ff2c216706cc267f271274fe182ced5421a1cadf2672"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.231665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-89e2-account-create-update-7656w" event={"ID":"25905c52-4074-40d4-826f-ef89353eeaa6","Type":"ContainerDied","Data":"1a6bf93e45faaf4db27d8e2c20d1f7fc553e5da9aa844b25ea2e3a9760a7bce6"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.231706 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a6bf93e45faaf4db27d8e2c20d1f7fc553e5da9aa844b25ea2e3a9760a7bce6" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.231771 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.237480 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.238075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eefc-account-create-update-h8n6c" event={"ID":"c78d063e-7cd7-4b41-b148-1a7f9a3f9914","Type":"ContainerDied","Data":"fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.238106 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.238167 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.246140 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7kcsc" event={"ID":"d5778eec-eb7e-4137-85bd-761ac78b9fd7","Type":"ContainerDied","Data":"414b6e4e860136d1c415be1caf745a8eda79544b985daf7b493e0eacf866bbda"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.246143 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.246190 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="414b6e4e860136d1c415be1caf745a8eda79544b985daf7b493e0eacf866bbda" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.251068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0f99-account-create-update-fflhf" event={"ID":"217ea569-e058-4f21-bbb7-d2f2648375eb","Type":"ContainerDied","Data":"7430b538256ae44d6944b8b2a907a2e6c7bd0b62c3823fced29b96d85a5d5b4f"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.251091 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7430b538256ae44d6944b8b2a907a2e6c7bd0b62c3823fced29b96d85a5d5b4f" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.251127 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.253376 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerStarted","Data":"3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.275259 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ws9fr" podStartSLOduration=4.323650632 podStartE2EDuration="12.275244292s" podCreationTimestamp="2026-02-19 19:37:44 +0000 UTC" firstStartedPulling="2026-02-19 19:37:47.806475959 +0000 UTC m=+1167.418826273" lastFinishedPulling="2026-02-19 19:37:55.758069609 +0000 UTC m=+1175.370419933" observedRunningTime="2026-02-19 19:37:56.266497659 +0000 UTC m=+1175.878847983" watchObservedRunningTime="2026-02-19 19:37:56.275244292 +0000 UTC m=+1175.887594616" Feb 19 19:37:56 crc kubenswrapper[4722]: 
I0219 19:37:56.281775 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281806 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281819 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281831 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281843 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281854 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281865 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281877 4722 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281888 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281900 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.264465 4722 generic.go:334] "Generic (PLEG): container finished" podID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerID="6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd" exitCode=0 Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.264572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerDied","Data":"6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd"} Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.266031 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"c3f50c281e50ed54f3395a6f901a3b6701619fc9571af90792278c0bfe6cf504"} Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.267699 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff6d44c7-0792-4927-8214-a62a52211e92" containerID="39d3bd74fcad2b2ba6a5d3be195f9ef849a5a1caabbd2723eb1f1b100ba3c28c" exitCode=0 Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.268143 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-6tmmr-config-x86ml" event={"ID":"ff6d44c7-0792-4927-8214-a62a52211e92","Type":"ContainerDied","Data":"39d3bd74fcad2b2ba6a5d3be195f9ef849a5a1caabbd2723eb1f1b100ba3c28c"} Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.282352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"35ffd6ba2011e4907a541e24b62381502edf1432e54bb29ac20e152a37e39c1e"} Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.282841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"45cb14460651143f32ebf26d6bbc03bb8d397ca69b720b2da78b224475a8ed78"} Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.545708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6tmmr" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.723672 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.730939 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827459 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827509 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827546 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827624 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827652 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827742 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827775 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827830 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828452 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828513 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828532 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run" (OuterVolumeSpecName: "var-run") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828619 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts" (OuterVolumeSpecName: "scripts") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828930 4722 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828957 4722 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828969 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828982 4722 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828995 4722 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.831473 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4" (OuterVolumeSpecName: "kube-api-access-28qs4") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "kube-api-access-28qs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.831558 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr" (OuterVolumeSpecName: "kube-api-access-mk9qr") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "kube-api-access-mk9qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.833902 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.858567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.879713 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data" (OuterVolumeSpecName: "config-data") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930866 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930900 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930910 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930921 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930930 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.298517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6tmmr-config-x86ml" event={"ID":"ff6d44c7-0792-4927-8214-a62a52211e92","Type":"ContainerDied","Data":"d80a4243d9916b25ce2acd62cd615d65a0ec3c0a009774f96f8f4e7954f803ae"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.298561 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d80a4243d9916b25ce2acd62cd615d65a0ec3c0a009774f96f8f4e7954f803ae" Feb 19 19:37:59 crc kubenswrapper[4722]: 
I0219 19:37:59.298580 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.299929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"f7744f8998e67a032261d2c1555245665f3e18041cfa2083a87fc83fdee4de9e"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.314770 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.314796 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerDied","Data":"f0c405b64fff456aecf84e0cb3dfbb788e3a93a4e01a434dac62f40edc004d0e"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.314837 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c405b64fff456aecf84e0cb3dfbb788e3a93a4e01a434dac62f40edc004d0e" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.319978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"0c36a68f8feba9e79e29839f692794ce920307708355ee8d3223d8832ac2ffdb"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.320035 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"9b22db2a298d3a844715acea22e05e515ceee761d9d6b2d67ff33b53edee69d9"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664307 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664756 
4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664784 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664812 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerName="glance-db-sync" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664822 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerName="glance-db-sync" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664843 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664852 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664870 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664879 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664901 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" 
containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664911 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664919 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664935 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664943 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664955 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" containerName="ovn-config" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664963 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" containerName="ovn-config" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664977 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664985 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665202 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerName="glance-db-sync" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665237 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665256 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665279 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665292 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665308 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665317 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665681 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665708 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" containerName="ovn-config" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.673475 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.680920 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743841 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743912 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.744239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.834211 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847535 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847566 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847593 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc 
kubenswrapper[4722]: I0219 19:37:59.847624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848022 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"]
Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848911 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.849032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.869695 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.987342 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:38:00 crc kubenswrapper[4722]: I0219 19:38:00.346436 4722 generic.go:334] "Generic (PLEG): container finished" podID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerID="3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5" exitCode=0
Feb 19 19:38:00 crc kubenswrapper[4722]: I0219 19:38:00.346512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerDied","Data":"3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5"}
Feb 19 19:38:00 crc kubenswrapper[4722]: I0219 19:38:00.491256 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"]
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.085455 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" path="/var/lib/kubelet/pods/ff6d44c7-0792-4927-8214-a62a52211e92/volumes"
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.357300 4722 generic.go:334] "Generic (PLEG): container finished" podID="49db2196-b62a-438c-974e-750f9c414846" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" exitCode=0
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.357370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerDied","Data":"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f"}
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.357400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerStarted","Data":"1730f03840824b8a57c2cfd81cc1d16b40832add768ea20e40c6c547ebe37c30"}
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.366138 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"b4c73d2acfb3bab47fe9f86d5b73dc7d2ad5ce017d727c42bd3e92ae8d48103e"}
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.366199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"191e07edde8d8015da65cc48db35a5f8d7a3b7c28981c7886317966771d73c53"}
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.366213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"278845d3dbe2d485391d238613e57cc6ad2cb9a1470b98c2007878ed5f3a1b7e"}
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.653694 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ws9fr"
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.683713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") "
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.683888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") "
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.683932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") "
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.695905 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm" (OuterVolumeSpecName: "kube-api-access-9zlfm") pod "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" (UID: "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb"). InnerVolumeSpecName "kube-api-access-9zlfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.709788 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" (UID: "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.734784 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data" (OuterVolumeSpecName: "config-data") pod "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" (UID: "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.785933 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.785965 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.785976 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.950298 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.410136 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerStarted","Data":"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017"}
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.410338 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.421442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"ddf4dc591f7c85df4c6746209688d14fe909b429746cbe7b920ee502df56cb84"}
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.427493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerDied","Data":"bab4e0dcd47bed11b26a97a238fcb572193e857fe8e5670dfa59d566460783b1"}
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.427537 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab4e0dcd47bed11b26a97a238fcb572193e857fe8e5670dfa59d566460783b1"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.427617 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ws9fr"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.437501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" podStartSLOduration=3.437482934 podStartE2EDuration="3.437482934s" podCreationTimestamp="2026-02-19 19:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:02.432639444 +0000 UTC m=+1182.044989778" watchObservedRunningTime="2026-02-19 19:38:02.437482934 +0000 UTC m=+1182.049833248"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.589098 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.609660 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ctdw7"]
Feb 19 19:38:02 crc kubenswrapper[4722]: E0219 19:38:02.610001 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerName="keystone-db-sync"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.610016 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerName="keystone-db-sync"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.613335 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerName="keystone-db-sync"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.614048 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618627 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618655 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618655 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618931 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.619111 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.673237 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.692431 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.694314 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.718855 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.803528 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nldcm"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.806811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.806919 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807140 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807207 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807808 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.810846 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4h658"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.817529 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.817673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.817920 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.818117 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.818397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.821527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.827289 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.849026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nldcm"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.868479 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909324 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909394 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909492 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909550 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909607 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.910982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.911719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.913862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.925028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.956404 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.958374 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7b98l"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.959624 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.962739 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.975410 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.975594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4wknf"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.995040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015510 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015550 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015623 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.020019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.020206 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.024336 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lnf5k"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.035486 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.035769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.037446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.037691 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lnf5k"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.042240 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.042509 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tj2ww"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.060088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.092702 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.117516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.117637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.117673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.144211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7b98l"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.144244 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lnf5k"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.144256 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.146317 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.146394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.149674 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.149922 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.149885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.150139 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bnkq4"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.152444 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.162824 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.162965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.166972 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.169375 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.171876 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.189656 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zrwzj"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.191311 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.194444 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7xcck" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.194653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.194884 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.197317 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nldcm" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.221575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222350 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222395 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.225131 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zrwzj"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.228836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.238418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l" Feb 19 
19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.262879 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.264414 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.267308 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.276019 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " 
pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.328517 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.328657 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 
19:38:03.329748 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329861 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329948 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330165 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330235 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.332902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.337597 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.356377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433321 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsfs\" (UniqueName: 
\"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433694 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433752 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"cloudkitty-db-sync-xdgs2\" (UID: 
\"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433830 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 
19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434031 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434053 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.435996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436130 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436183 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"placement-db-sync-zrwzj\" (UID: 
\"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.438511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446492 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.451351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.451877 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.457607 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 
19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.457807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.458050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.465129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.465958 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.466330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.467453 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.494991 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.504575 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.521414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"73bbded6588f4d10f80a3585928a42958ae2d126263b408adce35c4d3a24ec4b"} Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.541331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542519 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.543720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.544649 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.545787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod 
\"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.549995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.562709 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.570965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.673922 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.708718 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.729867 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.774979 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.777364 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781372 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781754 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s8kl" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781709 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.792899 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.861178 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.881613 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.887050 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.887407 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.896621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.908941 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949137 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949263 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 
19:38:03.949295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949339 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") 
" pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052839 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052942 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053000 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053014 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"glance-default-internal-api-0\" (UID: 
\"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053142 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" 
(UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.058833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.059429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.063058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.071822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 
19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.079788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.080484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.100727 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.148647 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.148685 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2019fecbddc337ddf53783637eb0008bc901e49a55294deb1e2d06fbb77c3ae3/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158414 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.164033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") 
pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.164356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.170072 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nldcm"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.199848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.213237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.246999 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.247074 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b323df4ccd136fd865256cd83fe693e56c32fbc8a05d96b41caf6babb703da86/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.282635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.282818 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.342292 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7b98l"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.343074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.498508 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.525557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.540508 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerStarted","Data":"faf928cc455dc359a2347459ccaaf8574498adb59570eba6f898fdc7c69b0cd6"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.543376 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"b77e3cc63155037458b7636e61f90253b9f0f19e1fb29907d523fc36aff23280"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.544032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerStarted","Data":"9d69e23e43e8ab2aa747e1b227270b4fab24359a7aa862c4ab858b12cf3f9985"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.547809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" 
event={"ID":"6ec24fa4-f123-4210-83f8-915ca2a1a88e","Type":"ContainerStarted","Data":"f8fb40f06ecabee91859e66632484f9e76d88441037ec83802ca97f9f4dee4d3"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.564464 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3f1f109-9754-4525-b5e8-dbf86ba52f2b" containerID="f7744f8998e67a032261d2c1555245665f3e18041cfa2083a87fc83fdee4de9e" exitCode=0 Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.564565 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerDied","Data":"f7744f8998e67a032261d2c1555245665f3e18041cfa2083a87fc83fdee4de9e"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.565910 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" containerID="cri-o://1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" gracePeriod=10 Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.566132 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerStarted","Data":"65597a01a3e59b230c7526b664301c7f8fdd9e898558a558f3adbb4bcd59ec0f"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.580530 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lnf5k"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.634856 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.724119 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.761167 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.777287 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.919478 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zrwzj"] Feb 19 19:38:04 crc kubenswrapper[4722]: W0219 19:38:04.929854 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41216a8d_32f8_4ec6_ab65_5474453cad03.slice/crio-459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94 WatchSource:0}: Error finding container 459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94: Status 404 returned error can't find the container with id 459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.053287 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:05 crc kubenswrapper[4722]: W0219 19:38:05.144717 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0987fde3_8329_4305_bd1c_efa7cf79306b.slice/crio-dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434 WatchSource:0}: Error finding container dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434: Status 404 returned error can't find the container with id dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.375047 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524043 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524280 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524332 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.549018 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7" (OuterVolumeSpecName: "kube-api-access-7m6p7") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "kube-api-access-7m6p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.592676 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"1faa29ce27320ab22dc6db2828db88d540021f7a0832148de51b439f8684b1f0"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.607575 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"0086b29404894dff5db29e861f6c264796c21706e5ca7150f1456fe9a82acdd8"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622004 4722 generic.go:334] "Generic (PLEG): container finished" podID="49db2196-b62a-438c-974e-750f9c414846" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" exitCode=0 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622035 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622182 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerDied","Data":"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerDied","Data":"1730f03840824b8a57c2cfd81cc1d16b40832add768ea20e40c6c547ebe37c30"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622263 4722 scope.go:117] "RemoveContainer" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.624935 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.629868 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.629910 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.657366 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.659421 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.671258 4722 generic.go:334] "Generic (PLEG): container finished" podID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerID="28dac43a68d14b0866389b29dfa45b324fd9b1b009f51a0ef8654fd374d27cfe" exitCode=0 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.671536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" event={"ID":"6ec24fa4-f123-4210-83f8-915ca2a1a88e","Type":"ContainerDied","Data":"28dac43a68d14b0866389b29dfa45b324fd9b1b009f51a0ef8654fd374d27cfe"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.674425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config" (OuterVolumeSpecName: "config") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.705551 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.721602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerStarted","Data":"8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.754096 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.754134 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.754147 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.780285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerStarted","Data":"c55c99500a8dc3a393a869149de80e388347c4c52dbc3f1981dc5cba2b917f9a"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.796649 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.808473 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.826077 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ctdw7" podStartSLOduration=3.826052082 
podStartE2EDuration="3.826052082s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:05.780111703 +0000 UTC m=+1185.392462027" watchObservedRunningTime="2026-02-19 19:38:05.826052082 +0000 UTC m=+1185.438402406" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.836611 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerStarted","Data":"a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.875416 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.887980 4722 scope.go:117] "RemoveContainer" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.889197 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7b98l" podStartSLOduration=3.889185107 podStartE2EDuration="3.889185107s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:05.869864215 +0000 UTC m=+1185.482214539" watchObservedRunningTime="2026-02-19 19:38:05.889185107 +0000 UTC m=+1185.501535431" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.942839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"d44927022c5675b593b112d3c323999589b2f15cd28af0dc8b1a34c98596e11d"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.943193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"406e254384848462bf0f562451ea2560c26093d401090dc9aacb9821506ef209"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.953403 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.956207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerStarted","Data":"b485d2ccfdc9766193d0fa763ea0b9af82b812effcaae62a566a8b1ce25316b5"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.972411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerStarted","Data":"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.972452 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerStarted","Data":"dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.975188 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerStarted","Data":"459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94"} Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.008942 4722 scope.go:117] "RemoveContainer" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" Feb 19 19:38:06 crc kubenswrapper[4722]: E0219 19:38:06.021688 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017\": container with ID starting with 1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017 not found: ID does not exist" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.021734 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017"} err="failed to get container status \"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017\": rpc error: code = NotFound desc = could not find container \"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017\": container with ID starting with 1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017 not found: ID does not exist" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.021759 4722 scope.go:117] "RemoveContainer" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" Feb 19 19:38:06 crc kubenswrapper[4722]: E0219 19:38:06.039439 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f\": container with ID starting with ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f not found: ID does not exist" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.039481 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f"} err="failed to get container status \"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f\": rpc error: code = NotFound desc = could not find container \"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f\": container with ID 
starting with ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f not found: ID does not exist" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.202234 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.219053 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.422656 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.497968 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498107 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498313 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: 
\"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498423 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.516324 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w" (OuterVolumeSpecName: "kube-api-access-tkc7w") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "kube-api-access-tkc7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.551941 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.568682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.579053 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.580209 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config" (OuterVolumeSpecName: "config") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603417 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603448 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603459 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603472 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 
19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603485 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.034977 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"dbaeb2907a7d4f1a8075471a7f624d26c20a73faf94a2ad21e1504b734de3c4e"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.035039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"8a1bc9bf3676a2530619b43c1e910cbeb74b46fe2d4d77c2a0f01940d7d90b78"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.035056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"9b92ed5967063c1aee12781aad0355a1be7a5579b64ae61784a503f110be9780"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038049 4722 generic.go:334] "Generic (PLEG): container finished" podID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9" exitCode=0 Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerDied","Data":"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038126 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" 
event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerStarted","Data":"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.068274 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.068252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" event={"ID":"6ec24fa4-f123-4210-83f8-915ca2a1a88e","Type":"ContainerDied","Data":"f8fb40f06ecabee91859e66632484f9e76d88441037ec83802ca97f9f4dee4d3"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.069273 4722 scope.go:117] "RemoveContainer" containerID="28dac43a68d14b0866389b29dfa45b324fd9b1b009f51a0ef8654fd374d27cfe" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.101389 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=41.491530975 podStartE2EDuration="48.101362235s" podCreationTimestamp="2026-02-19 19:37:19 +0000 UTC" firstStartedPulling="2026-02-19 19:37:56.251030728 +0000 UTC m=+1175.863381052" lastFinishedPulling="2026-02-19 19:38:02.860861988 +0000 UTC m=+1182.473212312" observedRunningTime="2026-02-19 19:38:07.079190154 +0000 UTC m=+1186.691540488" watchObservedRunningTime="2026-02-19 19:38:07.101362235 +0000 UTC m=+1186.713712569" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.126023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" podStartSLOduration=4.125999862 podStartE2EDuration="4.125999862s" podCreationTimestamp="2026-02-19 19:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:07.121376717 +0000 UTC m=+1186.733727041" watchObservedRunningTime="2026-02-19 19:38:07.125999862 +0000 UTC m=+1186.738350186" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.176087 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49db2196-b62a-438c-974e-750f9c414846" path="/var/lib/kubelet/pods/49db2196-b62a-438c-974e-750f9c414846/volumes" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.177073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerStarted","Data":"74d2769ab4752d1feeca0ef2edcd424d998dd7f01e76629b5fdbd1920be6013a"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.177108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerStarted","Data":"f05ea5f4636b64bd38579945e16464ca01ab6cde2bcc3d0ac468f593dd5c2f4e"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.221897 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.245912 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.395611 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435191 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:07 crc kubenswrapper[4722]: E0219 19:38:07.435682 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 
19:38:07.435704 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" Feb 19 19:38:07 crc kubenswrapper[4722]: E0219 19:38:07.435716 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435724 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: E0219 19:38:07.435748 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435756 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435970 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.436009 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.437439 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.444837 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.449735 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529669 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632445 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.633674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.633943 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: 
I0219 19:38:07.634185 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.635189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.635902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.649939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.823287 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.148042 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerStarted","Data":"26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d"} Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.153489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerStarted","Data":"0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c"} Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.163967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"8dc16b1e074fd3091b107bc3bda1e24e49c55a05b0a2a77d6492836d8b81cf1e"} Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.350274 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:08 crc kubenswrapper[4722]: W0219 19:38:08.396940 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0396bebf_310d_43c3_a0f5_e8cddf9c3cb0.slice/crio-ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf WatchSource:0}: Error finding container ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf: Status 404 returned error can't find the container with id ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.146526 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" path="/var/lib/kubelet/pods/6ec24fa4-f123-4210-83f8-915ca2a1a88e/volumes" Feb 19 19:38:09 crc 
kubenswrapper[4722]: I0219 19:38:09.192431 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerStarted","Data":"8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.192634 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log" containerID="cri-o://26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.192721 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd" containerID="cri-o://8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.208314 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerStarted","Data":"17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.208451 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log" containerID="cri-o://0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.208648 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd" 
containerID="cri-o://17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.229654 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.229611387 podStartE2EDuration="7.229611387s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:09.216777308 +0000 UTC m=+1188.829127632" watchObservedRunningTime="2026-02-19 19:38:09.229611387 +0000 UTC m=+1188.841961711" Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.250060 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.250033482 podStartE2EDuration="7.250033482s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:09.242060854 +0000 UTC m=+1188.854411178" watchObservedRunningTime="2026-02-19 19:38:09.250033482 +0000 UTC m=+1188.862383806" Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.262934 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"3745c2c1fca541ddbd93814a2b3c6f93a82b174021d51f8029526bbe280b334b"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.266851 4722 generic.go:334] "Generic (PLEG): container finished" podID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerID="e6f983685272a18f6384c33405c42fd7cac9d9c7919a092034ad166f31cb8a76" exitCode=0 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.267104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" 
event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerDied","Data":"e6f983685272a18f6384c33405c42fd7cac9d9c7919a092034ad166f31cb8a76"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.267907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerStarted","Data":"ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.268373 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns" containerID="cri-o://abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba" gracePeriod=10 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.303271 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.303143715 podStartE2EDuration="19.303143715s" podCreationTimestamp="2026-02-19 19:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:09.295824168 +0000 UTC m=+1188.908174512" watchObservedRunningTime="2026-02-19 19:38:09.303143715 +0000 UTC m=+1188.915494039" Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.820202 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006218 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006298 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006372 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.016945 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx" (OuterVolumeSpecName: "kube-api-access-t7qpx") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "kube-api-access-t7qpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.070796 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.089294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.101715 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config" (OuterVolumeSpecName: "config") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109142 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109209 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109219 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109228 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.135656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.211264 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.287845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerStarted","Data":"ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.288042 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297238 4722 generic.go:334] "Generic (PLEG): container finished" podID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerID="8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171" exitCode=0 Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297278 4722 generic.go:334] "Generic (PLEG): container finished" podID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerID="26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d" exitCode=143 Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerDied","Data":"8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerDied","Data":"26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d"} Feb 19 19:38:10 crc kubenswrapper[4722]: 
I0219 19:38:10.297408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerDied","Data":"74d2769ab4752d1feeca0ef2edcd424d998dd7f01e76629b5fdbd1920be6013a"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297419 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d2769ab4752d1feeca0ef2edcd424d998dd7f01e76629b5fdbd1920be6013a" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.300754 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303826 4722 generic.go:334] "Generic (PLEG): container finished" podID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerID="17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b" exitCode=0 Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303856 4722 generic.go:334] "Generic (PLEG): container finished" podID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerID="0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c" exitCode=143 Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerDied","Data":"17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerDied","Data":"0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerDied","Data":"f05ea5f4636b64bd38579945e16464ca01ab6cde2bcc3d0ac468f593dd5c2f4e"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303963 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f05ea5f4636b64bd38579945e16464ca01ab6cde2bcc3d0ac468f593dd5c2f4e" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.307423 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" podStartSLOduration=3.307411243 podStartE2EDuration="3.307411243s" podCreationTimestamp="2026-02-19 19:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:10.304712539 +0000 UTC m=+1189.917062863" watchObservedRunningTime="2026-02-19 19:38:10.307411243 +0000 UTC m=+1189.919761577" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317688 4722 generic.go:334] "Generic (PLEG): container finished" podID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba" exitCode=0 Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317741 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerDied","Data":"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317860 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerDied","Data":"dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434"} Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317881 4722 scope.go:117] "RemoveContainer" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.324253 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.384448 4722 scope.go:117] "RemoveContainer" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.406124 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415905 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415928 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415983 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416240 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416422 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416519 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416582 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: 
\"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416797 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.419320 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.419637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs" (OuterVolumeSpecName: "logs") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.419898 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs" (OuterVolumeSpecName: "logs") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.420661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.421971 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.422866 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts" (OuterVolumeSpecName: "scripts") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.423144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq" (OuterVolumeSpecName: "kube-api-access-7vsfq") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "kube-api-access-7vsfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.426051 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts" (OuterVolumeSpecName: "scripts") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.426482 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4" (OuterVolumeSpecName: "kube-api-access-l8lj4") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "kube-api-access-l8lj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.438626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (OuterVolumeSpecName: "glance") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.439614 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (OuterVolumeSpecName: "glance") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.455754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.471958 4722 scope.go:117] "RemoveContainer" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba" Feb 19 19:38:10 crc kubenswrapper[4722]: E0219 19:38:10.473069 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba\": container with ID starting with abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba not found: ID does not exist" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.473110 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"} err="failed to get container status \"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba\": rpc error: code = NotFound desc = could not find container \"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba\": container with ID starting with abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba not found: ID does not exist" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.473168 4722 scope.go:117] "RemoveContainer" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9" Feb 19 19:38:10 crc kubenswrapper[4722]: E0219 19:38:10.473618 4722 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9\": container with ID starting with 981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9 not found: ID does not exist" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.473643 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"} err="failed to get container status \"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9\": rpc error: code = NotFound desc = could not find container \"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9\": container with ID starting with 981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9 not found: ID does not exist" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.478961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.484220 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.505786 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data" (OuterVolumeSpecName: "config-data") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.506462 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data" (OuterVolumeSpecName: "config-data") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.512131 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.520959 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.520989 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521026 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521040 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521053 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521068 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" " Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521078 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8lj4\" (UniqueName: 
\"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521087 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521098 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521106 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521116 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521125 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521133 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521143 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 
19:38:10.521164 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521173 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.544050 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.544286 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5") on node "crc" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.558563 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.558732 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d") on node "crc" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.588956 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.623562 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.623604 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.107496 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" path="/var/lib/kubelet/pods/0987fde3-8329-4305-bd1c-efa7cf79306b/volumes" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.334963 4722 generic.go:334] "Generic (PLEG): container finished" podID="09a108ba-bb88-4799-a230-638cabf304b0" containerID="8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8" exitCode=0 Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.336221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerDied","Data":"8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8"} Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.336647 4722 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.337250 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.381032 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.399362 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.415038 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.423173 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.429769 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430213 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430275 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd" Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430333 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430379 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd" Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430449 4722 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="init" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430496 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="init" Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430586 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log" Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430648 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430694 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log" Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430747 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430799 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431010 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431074 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431129 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431200 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431250 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.432226 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.446165 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.447569 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.447699 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.458197 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s8kl" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.458516 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.458785 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.459056 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.459365 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.459182 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.476039 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556744 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556795 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557485 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557750 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557866 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc 
kubenswrapper[4722]: I0219 19:38:11.659056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659135 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659164 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659184 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659200 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659307 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659326 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659497 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.660060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.660082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.660772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.664482 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.664539 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b323df4ccd136fd865256cd83fe693e56c32fbc8a05d96b41caf6babb703da86/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.664749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.665530 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.665569 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2019fecbddc337ddf53783637eb0008bc901e49a55294deb1e2d06fbb77c3ae3/globalmount\"" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.666955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.669351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.670946 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.671193 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.676481 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.676534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.679891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.680775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.688418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.719355 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.723483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.783300 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.799190 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.799245 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.807028 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:13 crc kubenswrapper[4722]: I0219 19:38:13.085440 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" path="/var/lib/kubelet/pods/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e/volumes"
Feb 19 19:38:13 crc kubenswrapper[4722]: I0219 19:38:13.086860 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" path="/var/lib/kubelet/pods/b5e004a3-da53-4fbb-a396-52e33d205e2e/volumes"
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.895376 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") "
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") "
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") "
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") "
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") "
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") "
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.949325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.949390 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.949822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts" (OuterVolumeSpecName: "scripts") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.951134 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6" (OuterVolumeSpecName: "kube-api-access-4xjh6") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "kube-api-access-4xjh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.974005 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data" (OuterVolumeSpecName: "config-data") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.980359 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043017 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043393 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043405 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043415 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043423 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043431 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.381406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerDied","Data":"9d69e23e43e8ab2aa747e1b227270b4fab24359a7aa862c4ab858b12cf3f9985"}
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.381460 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.381463 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d69e23e43e8ab2aa747e1b227270b4fab24359a7aa862c4ab858b12cf3f9985"
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.988257 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"]
Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.996091 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"]
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.067470 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-f6zx8"]
Feb 19 19:38:17 crc kubenswrapper[4722]: E0219 19:38:17.068345 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a108ba-bb88-4799-a230-638cabf304b0" containerName="keystone-bootstrap"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.068496 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a108ba-bb88-4799-a230-638cabf304b0" containerName="keystone-bootstrap"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.069539 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a108ba-bb88-4799-a230-638cabf304b0" containerName="keystone-bootstrap"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.072094 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074236 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074386 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074600 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.110452 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a108ba-bb88-4799-a230-638cabf304b0" path="/var/lib/kubelet/pods/09a108ba-bb88-4799-a230-638cabf304b0/volumes"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.111530 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"]
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168572 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168691 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.269898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.269998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.274329 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.274801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.277781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.282172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.285697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.290813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.395526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.824401 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4"
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.881626 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"]
Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.881871 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" containerID="cri-o://acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3" gracePeriod=10
Feb 19 19:38:18 crc kubenswrapper[4722]: I0219 19:38:18.401820 4722 generic.go:334] "Generic (PLEG): container finished" podID="b12e3334-cc75-47af-870a-3d86164cb249" containerID="acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3" exitCode=0
Feb 19 19:38:18 crc kubenswrapper[4722]: I0219 19:38:18.401904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerDied","Data":"acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3"}
Feb 19 19:38:19 crc kubenswrapper[4722]: I0219 19:38:19.672064 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Feb 19 19:38:20 crc kubenswrapper[4722]: I0219 19:38:20.589512 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:38:20 crc kubenswrapper[4722]: I0219 19:38:20.596299 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:38:21 crc kubenswrapper[4722]: I0219 19:38:21.436970 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:38:24 crc kubenswrapper[4722]: I0219 19:38:24.458379 4722 generic.go:334] "Generic (PLEG): container finished" podID="eab1ce59-2254-419a-bab0-cf5e87888634" containerID="a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b" exitCode=0
Feb 19 19:38:24 crc kubenswrapper[4722]: I0219 19:38:24.458456 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerDied","Data":"a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b"}
Feb 19 19:38:24 crc kubenswrapper[4722]: I0219 19:38:24.672570 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Feb 19 19:38:29 crc kubenswrapper[4722]: I0219 19:38:29.673278 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Feb 19 19:38:29 crc kubenswrapper[4722]: I0219 19:38:29.674119 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jvpfv"
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.573851 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerDied","Data":"faf928cc455dc359a2347459ccaaf8574498adb59570eba6f898fdc7c69b0cd6"}
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.574141 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf928cc455dc359a2347459ccaaf8574498adb59570eba6f898fdc7c69b0cd6"
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.672090 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.774225 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"eab1ce59-2254-419a-bab0-cf5e87888634\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") "
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.774362 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"eab1ce59-2254-419a-bab0-cf5e87888634\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") "
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.774457 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"eab1ce59-2254-419a-bab0-cf5e87888634\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") "
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.800399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt" (OuterVolumeSpecName: "kube-api-access-4n4nt") pod "eab1ce59-2254-419a-bab0-cf5e87888634" (UID: "eab1ce59-2254-419a-bab0-cf5e87888634"). InnerVolumeSpecName "kube-api-access-4n4nt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.812454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config" (OuterVolumeSpecName: "config") pod "eab1ce59-2254-419a-bab0-cf5e87888634" (UID: "eab1ce59-2254-419a-bab0-cf5e87888634"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.821253 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eab1ce59-2254-419a-bab0-cf5e87888634" (UID: "eab1ce59-2254-419a-bab0-cf5e87888634"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.896395 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.896739 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.896756 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.131413 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.581780 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.860029 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.860214 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cdjl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nldcm_openstack(512a4c5e-3ea6-42a8-9f83-8c0e5375891d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.862175 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nldcm" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d"
Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.944534 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"]
Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.944988 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" containerName="neutron-db-sync"
Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.945005 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" containerName="neutron-db-sync"
Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.945209 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" containerName="neutron-db-sync"
Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.946264 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.958309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"]
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014672 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014866 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.100743 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7445db86-7r6w9"]
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.102368 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106655 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106615 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4wknf" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106916 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.112265 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.115891 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.115950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: 
\"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116119 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116971 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" 
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.117082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.117299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.117774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.151813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218361 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.271834 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321209 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 
19:38:33.325368 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.329737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.331043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.332385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.339422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.423956 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.282790 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-nldcm" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.983526 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.983681 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dvk66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-lnf5k_openstack(9c2453a9-4c81-4256-b52d-edb69c12c7d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.984827 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-lnf5k" 
podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" Feb 19 19:38:35 crc kubenswrapper[4722]: W0219 19:38:35.000791 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c5d98f_45b4_4fd8_876b_3471da720a4b.slice/crio-8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d WatchSource:0}: Error finding container 8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d: Status 404 returned error can't find the container with id 8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.121909 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.253814 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.253884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.253921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.254126 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.254220 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.261760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx" (OuterVolumeSpecName: "kube-api-access-bs5nx") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "kube-api-access-bs5nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.283967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerDied","Data":"1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325"} Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.284046 4722 scope.go:117] "RemoveContainer" containerID="acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.284226 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.290569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerStarted","Data":"8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d"} Feb 19 19:38:35 crc kubenswrapper[4722]: E0219 19:38:35.291564 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-lnf5k" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.320843 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.324557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config" (OuterVolumeSpecName: "config") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.340211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.342858 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356187 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356219 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356229 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356239 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356250 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.636023 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.651668 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.870870 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:38:35 crc kubenswrapper[4722]: E0219 19:38:35.871589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="init" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.871616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="init" Feb 19 19:38:35 crc kubenswrapper[4722]: E0219 19:38:35.871638 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.871646 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.871883 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.873092 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.884371 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.884923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.893585 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967602 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.069808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.069881 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qkz\" (UniqueName: 
\"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.069908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070177 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: 
\"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.075493 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.076275 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.076417 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.077332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.079576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.082608 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.089758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.195048 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:37 crc kubenswrapper[4722]: I0219 19:38:37.081941 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12e3334-cc75-47af-870a-3d86164cb249" path="/var/lib/kubelet/pods/b12e3334-cc75-47af-870a-3d86164cb249/volumes" Feb 19 19:38:37 crc kubenswrapper[4722]: I0219 19:38:37.898513 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:39 crc kubenswrapper[4722]: I0219 19:38:39.672278 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Feb 19 19:38:39 crc kubenswrapper[4722]: W0219 19:38:39.975444 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58e51a47_7d37_46de_96cc_609365fab496.slice/crio-9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac WatchSource:0}: Error finding container 
9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac: Status 404 returned error can't find the container with id 9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac Feb 19 19:38:40 crc kubenswrapper[4722]: I0219 19:38:40.338254 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerStarted","Data":"9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac"} Feb 19 19:38:40 crc kubenswrapper[4722]: I0219 19:38:40.425122 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:38:40 crc kubenswrapper[4722]: I0219 19:38:40.552248 4722 scope.go:117] "RemoveContainer" containerID="58f8459d38255bc0ee2a3b1d7c9b5ab8e43bfd9e3de2e5dd8ef6021c2a7233ed" Feb 19 19:38:40 crc kubenswrapper[4722]: W0219 19:38:40.591514 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6175472a_2fd6_4b07_bcb1_4e441a4587aa.slice/crio-11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049 WatchSource:0}: Error finding container 11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049: Status 404 returned error can't find the container with id 11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049 Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.601945 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.601994 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.602202 
4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7zht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privil
eged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-xdgs2_openstack(fb399ce1-7269-4d99-9140-0d1d33a6fd6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.603357 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-xdgs2" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.089584 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:38:41 crc kubenswrapper[4722]: W0219 19:38:41.132047 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf618be57_2b9f_4455_8de0_90379bc9d57b.slice/crio-b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea WatchSource:0}: Error finding container b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea: Status 404 returned error can't find the container with id b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.304501 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.349020 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.352268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerStarted","Data":"b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.353589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerStarted","Data":"90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.362052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerStarted","Data":"c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.362139 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerStarted","Data":"11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.365200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerStarted","Data":"245b2a4bf08b03ca07fdc608528d3501f8e470227ac611d75e1e28818470fe64"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.370241 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zrwzj" podStartSLOduration=8.316199213 podStartE2EDuration="38.370226939s" podCreationTimestamp="2026-02-19 19:38:03 +0000 
UTC" firstStartedPulling="2026-02-19 19:38:04.947520156 +0000 UTC m=+1184.559870480" lastFinishedPulling="2026-02-19 19:38:35.001547882 +0000 UTC m=+1214.613898206" observedRunningTime="2026-02-19 19:38:41.368463903 +0000 UTC m=+1220.980814247" watchObservedRunningTime="2026-02-19 19:38:41.370226939 +0000 UTC m=+1220.982577253" Feb 19 19:38:41 crc kubenswrapper[4722]: E0219 19:38:41.378645 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-xdgs2" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.399456 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-f6zx8" podStartSLOduration=24.399424066999998 podStartE2EDuration="24.399424067s" podCreationTimestamp="2026-02-19 19:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:41.388426674 +0000 UTC m=+1221.000776999" watchObservedRunningTime="2026-02-19 19:38:41.399424067 +0000 UTC m=+1221.011774391" Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.421118 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.805357 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.805660 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.390336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerStarted","Data":"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.390981 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerStarted","Data":"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.395702 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerStarted","Data":"6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.395736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerStarted","Data":"e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.396533 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.402597 4722 generic.go:334] "Generic (PLEG): container finished" podID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerID="4dec94c6774384698a0cf861b554d74fb1ddd8514338b3e11d17056ce861d124" exitCode=0 Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 
19:38:42.402829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerDied","Data":"4dec94c6774384698a0cf861b554d74fb1ddd8514338b3e11d17056ce861d124"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.409811 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerStarted","Data":"3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.409847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerStarted","Data":"0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerStarted","Data":"df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerStarted","Data":"6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerStarted","Data":"0d269d0087152d6edd92c6c1c2324f5e6566d6cbbbcd03d88628b974769fb6f5"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422717 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7445db86-7r6w9" Feb 
19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.451852 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d74fd689-q5qhb" podStartSLOduration=7.451833543 podStartE2EDuration="7.451833543s" podCreationTimestamp="2026-02-19 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.444307259 +0000 UTC m=+1222.056657583" watchObservedRunningTime="2026-02-19 19:38:42.451833543 +0000 UTC m=+1222.064183857" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.458402 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.458389777 podStartE2EDuration="31.458389777s" podCreationTimestamp="2026-02-19 19:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.421449028 +0000 UTC m=+1222.033799352" watchObservedRunningTime="2026-02-19 19:38:42.458389777 +0000 UTC m=+1222.070740101" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.494045 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=31.494029246 podStartE2EDuration="31.494029246s" podCreationTimestamp="2026-02-19 19:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.493245632 +0000 UTC m=+1222.105595956" watchObservedRunningTime="2026-02-19 19:38:42.494029246 +0000 UTC m=+1222.106379570" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.523988 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7445db86-7r6w9" podStartSLOduration=9.523958768 podStartE2EDuration="9.523958768s" podCreationTimestamp="2026-02-19 19:38:33 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.508666562 +0000 UTC m=+1222.121016906" watchObservedRunningTime="2026-02-19 19:38:42.523958768 +0000 UTC m=+1222.136309092" Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.464616 4722 generic.go:334] "Generic (PLEG): container finished" podID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerID="c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee" exitCode=0 Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.464909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerDied","Data":"c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.471002 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerStarted","Data":"0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.471124 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.472956 4722 generic.go:334] "Generic (PLEG): container finished" podID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerID="90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6" exitCode=0 Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.473008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerDied","Data":"90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.486967 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.502517 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" podStartSLOduration=14.502500304 podStartE2EDuration="14.502500304s" podCreationTimestamp="2026-02-19 19:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:46.50205859 +0000 UTC m=+1226.114408924" watchObservedRunningTime="2026-02-19 19:38:46.502500304 +0000 UTC m=+1226.114850628" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.142454 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.151618 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.311889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312597 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312666 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312699 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312752 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312821 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 
19:38:50.313000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs" (OuterVolumeSpecName: "logs") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.313541 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9" (OuterVolumeSpecName: "kube-api-access-kmgm9") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "kube-api-access-kmgm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328364 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts" (OuterVolumeSpecName: "scripts") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8" (OuterVolumeSpecName: "kube-api-access-kdxx8") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "kube-api-access-kdxx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328364 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts" (OuterVolumeSpecName: "scripts") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.339590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data" (OuterVolumeSpecName: "config-data") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.344864 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data" (OuterVolumeSpecName: "config-data") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.345455 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.345912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.414833 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.414959 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415030 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415089 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415142 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415211 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415261 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415312 4722 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415376 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415436 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.540442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerDied","Data":"11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049"} Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.540505 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.540609 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.552371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerDied","Data":"459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94"} Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.553193 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.552402 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.555299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd"} Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.243446 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7cb5f76f4-hx5jh"] Feb 19 19:38:51 crc kubenswrapper[4722]: E0219 19:38:51.244099 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerName="placement-db-sync" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244112 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerName="placement-db-sync" Feb 19 19:38:51 crc kubenswrapper[4722]: E0219 19:38:51.244124 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerName="keystone-bootstrap" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244130 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerName="keystone-bootstrap" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244368 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerName="placement-db-sync" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244381 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerName="keystone-bootstrap" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.245099 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259029 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cb5f76f4-hx5jh"] Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259335 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259558 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259674 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259773 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259958 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.260121 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338011 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-combined-ca-bundle\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-fernet-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-credential-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-internal-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338233 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-scripts\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-config-data\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-public-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gvk\" (UniqueName: \"kubernetes.io/projected/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-kube-api-access-c6gvk\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.362923 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.380121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.380304 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.385643 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.385870 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.385989 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7xcck" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.386109 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.386624 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-credential-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-internal-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-scripts\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-config-data\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: 
\"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-public-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gvk\" (UniqueName: \"kubernetes.io/projected/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-kube-api-access-c6gvk\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-combined-ca-bundle\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439900 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-fernet-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " 
pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.444855 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-credential-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.445142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-scripts\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.446268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-combined-ca-bundle\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 
19:38:51.453737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-internal-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.455466 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-fernet-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.460263 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-config-data\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.462757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-public-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.463722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gvk\" (UniqueName: \"kubernetes.io/projected/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-kube-api-access-c6gvk\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.541903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.542186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.542297 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.542834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.544197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.544585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.544769 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.545281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.545689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.547798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.548043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " 
pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.548515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.548521 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.562789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.565635 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerStarted","Data":"30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd"} Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.567847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerStarted","Data":"fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6"} Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.595180 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lnf5k" 
podStartSLOduration=3.6166137149999997 podStartE2EDuration="49.595142154s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.623324989 +0000 UTC m=+1184.235675313" lastFinishedPulling="2026-02-19 19:38:50.601853428 +0000 UTC m=+1230.214203752" observedRunningTime="2026-02-19 19:38:51.590334045 +0000 UTC m=+1231.202684379" watchObservedRunningTime="2026-02-19 19:38:51.595142154 +0000 UTC m=+1231.207492478" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.601797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.625666 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nldcm" podStartSLOduration=3.332261608 podStartE2EDuration="49.625641044s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.304398955 +0000 UTC m=+1183.916749279" lastFinishedPulling="2026-02-19 19:38:50.597778391 +0000 UTC m=+1230.210128715" observedRunningTime="2026-02-19 19:38:51.617125659 +0000 UTC m=+1231.229475983" watchObservedRunningTime="2026-02-19 19:38:51.625641044 +0000 UTC m=+1231.237991368" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.674944 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7cc7c8879d-tnbfs"] Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.676642 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.686744 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc7c8879d-tnbfs"] Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.705729 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-internal-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749122 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hzcg\" (UniqueName: \"kubernetes.io/projected/41b669ab-d733-4941-b134-b9ad19b38143-kube-api-access-8hzcg\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-combined-ca-bundle\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749209 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-scripts\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b669ab-d733-4941-b134-b9ad19b38143-logs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: 
\"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749312 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-public-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-config-data\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.784045 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.784105 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.808094 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.808465 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.850883 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.851037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 
19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.858075 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzcg\" (UniqueName: \"kubernetes.io/projected/41b669ab-d733-4941-b134-b9ad19b38143-kube-api-access-8hzcg\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.859530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-combined-ca-bundle\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-scripts\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b669ab-d733-4941-b134-b9ad19b38143-logs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860723 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-public-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860759 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-config-data\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-internal-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.861889 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.862634 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b669ab-d733-4941-b134-b9ad19b38143-logs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.864506 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-combined-ca-bundle\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.864948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-scripts\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: 
I0219 19:38:51.865860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-internal-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.866920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-public-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.866973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-config-data\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.885663 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.897667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hzcg\" (UniqueName: \"kubernetes.io/projected/41b669ab-d733-4941-b134-b9ad19b38143-kube-api-access-8hzcg\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.040876 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.208691 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cb5f76f4-hx5jh"] Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.363838 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.580971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cb5f76f4-hx5jh" event={"ID":"32b3c2bb-2288-4e2e-a9c6-d19cfe651181","Type":"ContainerStarted","Data":"824de0511e4184bceffeffe595fe857a7445d6d50c5df6ccff862b78774504e1"} Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.581026 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cb5f76f4-hx5jh" event={"ID":"32b3c2bb-2288-4e2e-a9c6-d19cfe651181","Type":"ContainerStarted","Data":"324f78b297395cee71e55b72591f8e7896ccf64ced6e26562e5702f64b3dffd4"} Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.581609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.583409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerStarted","Data":"c688d661fba42ba2a53e010b04af9f22dbacb7137f02c088f90b0645fc7ab228"} Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584102 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584352 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584370 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584380 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.669138 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7cb5f76f4-hx5jh" podStartSLOduration=1.669118873 podStartE2EDuration="1.669118873s" podCreationTimestamp="2026-02-19 19:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:52.610705745 +0000 UTC m=+1232.223056069" watchObservedRunningTime="2026-02-19 19:38:52.669118873 +0000 UTC m=+1232.281469197" Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.685294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc7c8879d-tnbfs"] Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.275302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.354891 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.355239 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" containerID="cri-o://ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985" gracePeriod=10 Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.645114 4722 generic.go:334] "Generic (PLEG): container finished" podID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerID="ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985" exitCode=0 Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.645482 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerDied","Data":"ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985"} Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerStarted","Data":"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d"} Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerStarted","Data":"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df"} Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677855 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677907 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.684141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc7c8879d-tnbfs" event={"ID":"41b669ab-d733-4941-b134-b9ad19b38143","Type":"ContainerStarted","Data":"f51bf694020dd5e39c5b1ce070330d37b6eaff405cc0ba31780c5a96b0a35ded"} Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.684214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc7c8879d-tnbfs" event={"ID":"41b669ab-d733-4941-b134-b9ad19b38143","Type":"ContainerStarted","Data":"8ec749216293740638000f28d158c0105a2cad692d91aa2514c37dfb1a5704af"} Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.736422 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-7cc6894556-2r5j6" podStartSLOduration=2.736400162 podStartE2EDuration="2.736400162s" podCreationTimestamp="2026-02-19 19:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:53.700725821 +0000 UTC m=+1233.313076155" watchObservedRunningTime="2026-02-19 19:38:53.736400162 +0000 UTC m=+1233.348750496" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.320036 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427184 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427776 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" 
(UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.428038 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.455399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp" (OuterVolumeSpecName: "kube-api-access-65hgp") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "kube-api-access-65hgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.486963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.496101 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.530702 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.530742 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.530759 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.531167 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config" (OuterVolumeSpecName: "config") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.532057 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.595561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.634533 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.634578 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.634588 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.713225 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.713244 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.714110 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.720430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerDied","Data":"ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf"} Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.720501 4722 scope.go:117] "RemoveContainer" containerID="ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.766208 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.776098 4722 scope.go:117] "RemoveContainer" containerID="e6f983685272a18f6384c33405c42fd7cac9d9c7919a092034ad166f31cb8a76" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.779187 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.083654 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" path="/var/lib/kubelet/pods/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0/volumes" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.725833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerStarted","Data":"8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db"} Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.736612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc7c8879d-tnbfs" event={"ID":"41b669ab-d733-4941-b134-b9ad19b38143","Type":"ContainerStarted","Data":"45398e9194cf34a912d856e8546e54bf77cb6213b237928b64ffef777fb10ae1"} Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 
19:38:55.736649 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.736669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.756445 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-xdgs2" podStartSLOduration=4.810693082 podStartE2EDuration="53.756426917s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.77728589 +0000 UTC m=+1184.389636214" lastFinishedPulling="2026-02-19 19:38:53.723019725 +0000 UTC m=+1233.335370049" observedRunningTime="2026-02-19 19:38:55.750723479 +0000 UTC m=+1235.363073813" watchObservedRunningTime="2026-02-19 19:38:55.756426917 +0000 UTC m=+1235.368777241" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.782907 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7cc7c8879d-tnbfs" podStartSLOduration=4.78288841 podStartE2EDuration="4.78288841s" podCreationTimestamp="2026-02-19 19:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:55.776498041 +0000 UTC m=+1235.388848365" watchObservedRunningTime="2026-02-19 19:38:55.78288841 +0000 UTC m=+1235.395238735" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.967044 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.967186 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.973529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" 
Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.979481 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.979602 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:56 crc kubenswrapper[4722]: I0219 19:38:56.187683 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 19:38:56 crc kubenswrapper[4722]: I0219 19:38:56.750675 4722 generic.go:334] "Generic (PLEG): container finished" podID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerID="30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd" exitCode=0 Feb 19 19:38:56 crc kubenswrapper[4722]: I0219 19:38:56.750745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerDied","Data":"30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.443297 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.556579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.556670 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.556853 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.562427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9c2453a9-4c81-4256-b52d-edb69c12c7d7" (UID: "9c2453a9-4c81-4256-b52d-edb69c12c7d7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.562486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66" (OuterVolumeSpecName: "kube-api-access-dvk66") pod "9c2453a9-4c81-4256-b52d-edb69c12c7d7" (UID: "9c2453a9-4c81-4256-b52d-edb69c12c7d7"). 
InnerVolumeSpecName "kube-api-access-dvk66". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.587878 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c2453a9-4c81-4256-b52d-edb69c12c7d7" (UID: "9c2453a9-4c81-4256-b52d-edb69c12c7d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.659096 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.659145 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.659202 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.788910 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789009 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" 
containerID="cri-o://6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789058 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789097 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" containerID="cri-o://891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789259 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" containerID="cri-o://5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789450 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" containerID="cri-o://81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.792250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerDied","Data":"c55c99500a8dc3a393a869149de80e388347c4c52dbc3f1981dc5cba2b917f9a"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.792284 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c55c99500a8dc3a393a869149de80e388347c4c52dbc3f1981dc5cba2b917f9a" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.792343 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.798839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerDied","Data":"fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.798999 4722 generic.go:334] "Generic (PLEG): container finished" podID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerID="fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6" exitCode=0 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.801338 4722 generic.go:334] "Generic (PLEG): container finished" podID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerID="8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db" exitCode=0 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.801390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerDied","Data":"8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.815805 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.294954117 podStartE2EDuration="58.815782673s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.777366602 +0000 UTC m=+1184.389716916" lastFinishedPulling="2026-02-19 19:39:00.298195148 +0000 UTC m=+1239.910545472" observedRunningTime="2026-02-19 19:39:00.811976744 +0000 UTC m=+1240.424327078" watchObservedRunningTime="2026-02-19 19:39:00.815782673 +0000 UTC m=+1240.428133007" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814058 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" 
containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" exitCode=0 Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814461 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" exitCode=2 Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814475 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" exitCode=0 Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c"} Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd"} Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14"} Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864057 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6767bd5ccf-ggbrg"] Feb 19 19:39:01 crc kubenswrapper[4722]: E0219 19:39:01.864477 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="init" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864490 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="init" Feb 19 19:39:01 crc kubenswrapper[4722]: E0219 19:39:01.864500 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864505 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" Feb 19 19:39:01 crc kubenswrapper[4722]: E0219 19:39:01.864520 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerName="barbican-db-sync" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864529 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerName="barbican-db-sync" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864701 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerName="barbican-db-sync" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864720 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.870424 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.881361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.881707 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.888927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tj2ww" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.902005 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6767bd5ccf-ggbrg"] Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.964251 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-98b54b474-9tfhf"] Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.976254 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.979904 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.986576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data-custom\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.986938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66f5042d-2b30-4ac4-8594-cfc0f9590460-logs\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.986992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.987099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-combined-ca-bundle\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.987183 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc24m\" (UniqueName: \"kubernetes.io/projected/66f5042d-2b30-4ac4-8594-cfc0f9590460-kube-api-access-zc24m\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.043548 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-98b54b474-9tfhf"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.074897 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.079180 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088767 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088862 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-combined-ca-bundle\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/96ffdf9d-f932-419b-be31-9f38358d2db5-logs\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8zg\" (UniqueName: \"kubernetes.io/projected/96ffdf9d-f932-419b-be31-9f38358d2db5-kube-api-access-tc8zg\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.089028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66f5042d-2b30-4ac4-8594-cfc0f9590460-logs\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.089135 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.089489 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66f5042d-2b30-4ac4-8594-cfc0f9590460-logs\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.090117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data-custom\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.090262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-combined-ca-bundle\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.100371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.102970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc24m\" (UniqueName: \"kubernetes.io/projected/66f5042d-2b30-4ac4-8594-cfc0f9590460-kube-api-access-zc24m\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.103098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data-custom\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.104325 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.105739 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-combined-ca-bundle\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.108750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data-custom\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.117849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc24m\" (UniqueName: \"kubernetes.io/projected/66f5042d-2b30-4ac4-8594-cfc0f9590460-kube-api-access-zc24m\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.191201 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.192936 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.198537 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-combined-ca-bundle\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ffdf9d-f932-419b-be31-9f38358d2db5-logs\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8zg\" (UniqueName: \"kubernetes.io/projected/96ffdf9d-f932-419b-be31-9f38358d2db5-kube-api-access-tc8zg\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204428 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204460 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data-custom\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: 
I0219 19:39:02.204624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.205983 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ffdf9d-f932-419b-be31-9f38358d2db5-logs\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.211452 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data-custom\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.216676 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-combined-ca-bundle\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " 
pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.218255 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.220760 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.228927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8zg\" (UniqueName: \"kubernetes.io/projected/96ffdf9d-f932-419b-be31-9f38358d2db5-kube-api-access-tc8zg\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.229514 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.306901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " 
pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307977 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " 
pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308547 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " 
pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.309040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.309618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.310020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.310187 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.322688 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.328080 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.410994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411067 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411143 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: 
\"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411201 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.414803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.422009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.424809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.431613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc 
kubenswrapper[4722]: I0219 19:39:02.450173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.518104 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.530590 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.532577 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618209 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618374 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618485 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618653 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") "
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.622485 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.625127 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts" (OuterVolumeSpecName: "scripts") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.625237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht" (OuterVolumeSpecName: "kube-api-access-l7zht") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "kube-api-access-l7zht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.629411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.633554 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs" (OuterVolumeSpecName: "certs") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.635683 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8" (OuterVolumeSpecName: "kube-api-access-cdjl8") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "kube-api-access-cdjl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.639114 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts" (OuterVolumeSpecName: "scripts") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.661966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.663865 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data" (OuterVolumeSpecName: "config-data") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.674975 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.694644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data" (OuterVolumeSpecName: "config-data") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.713707 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726728 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726762 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726779 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726792 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726801 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726810 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726821 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726829 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726837 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726844 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726852 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.752209 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6767bd5ccf-ggbrg"]
Feb 19 19:39:02 crc kubenswrapper[4722]: W0219 19:39:02.753614 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66f5042d_2b30_4ac4_8594_cfc0f9590460.slice/crio-f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8 WatchSource:0}: Error finding container f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8: Status 404 returned error can't find the container with id f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.831350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerDied","Data":"65597a01a3e59b230c7526b664301c7f8fdd9e898558a558f3adbb4bcd59ec0f"}
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.831401 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65597a01a3e59b230c7526b664301c7f8fdd9e898558a558f3adbb4bcd59ec0f"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.831368 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.834906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" event={"ID":"66f5042d-2b30-4ac4-8594-cfc0f9590460","Type":"ContainerStarted","Data":"f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8"}
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.839378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerDied","Data":"b485d2ccfdc9766193d0fa763ea0b9af82b812effcaae62a566a8b1ce25316b5"}
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.839414 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b485d2ccfdc9766193d0fa763ea0b9af82b812effcaae62a566a8b1ce25316b5"
Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.839479 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.013161 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-98b54b474-9tfhf"]
Feb 19 19:39:03 crc kubenswrapper[4722]: W0219 19:39:03.016602 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ffdf9d_f932_419b_be31_9f38358d2db5.slice/crio-210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2 WatchSource:0}: Error finding container 210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2: Status 404 returned error can't find the container with id 210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.060882 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"]
Feb 19 19:39:03 crc kubenswrapper[4722]: E0219 19:39:03.061328 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerName="cloudkitty-db-sync"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.061344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerName="cloudkitty-db-sync"
Feb 19 19:39:03 crc kubenswrapper[4722]: E0219 19:39:03.061366 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerName="cinder-db-sync"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.061372 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerName="cinder-db-sync"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.061550 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerName="cloudkitty-db-sync"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.061579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerName="cinder-db-sync"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.062288 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.066866 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.066914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.067024 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bnkq4"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.074459 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.074695 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.102109 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"]
Feb 19 19:39:03 crc kubenswrapper[4722]: W0219 19:39:03.128738 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ef6823_2e42_41cf_8eda_f9ea51c8c6f5.slice/crio-c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d WatchSource:0}: Error finding container c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d: Status 404 returned error can't find the container with id c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134694 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134972 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.135035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.136916 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.155861 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.158759 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.161752 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4h658"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.162091 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.163301 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.175850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.185085 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.236429 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237766 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237899 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.238025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.238043 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.238101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.242116 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.249451 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.271036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.282260 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.317559 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.320528 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.325114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.325320 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.341860 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.369036 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.407223 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.413854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.413928 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.414764 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.415821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.415931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.415969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.416023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.416100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.416235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417127 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.420891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.421959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.422439 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.422530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.439058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.439898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.449911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.462275 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7445db86-7r6w9"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.505244 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519964 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519988 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc 
kubenswrapper[4722]: I0219 19:39:03.520859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.521730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.528080 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.528754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.529320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.534380 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.540573 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.558369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.622510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.622698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.624419 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod 
\"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.624590 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.625425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.642781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.772356 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.772578 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d74fd689-q5qhb" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" containerID="cri-o://e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0" gracePeriod=30 Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.773000 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d74fd689-q5qhb" 
podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" containerID="cri-o://6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0" gracePeriod=30 Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.819996 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8694c7b8f7-2td8g"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.822059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.825453 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d74fd689-q5qhb" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9696/\": EOF" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.881093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8694c7b8f7-2td8g"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.885127 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.915368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" event={"ID":"96ffdf9d-f932-419b-be31-9f38358d2db5","Type":"ContainerStarted","Data":"210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2"} Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.921411 4722 generic.go:334] "Generic (PLEG): container finished" podID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerID="3ee66e2d9f3ff70ba95788fa2041bc3ab471615c0403f4144d9bc1ae897eb89c" exitCode=0 Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.921478 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" event={"ID":"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5","Type":"ContainerDied","Data":"3ee66e2d9f3ff70ba95788fa2041bc3ab471615c0403f4144d9bc1ae897eb89c"} Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.921505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" event={"ID":"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5","Type":"ContainerStarted","Data":"c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d"} Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9b27\" (UniqueName: \"kubernetes.io/projected/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-kube-api-access-p9b27\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931350 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-ovndb-tls-certs\") pod 
\"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931548 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-public-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.932104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-combined-ca-bundle\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.932256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-internal-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.932363 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-httpd-config\") pod 
\"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerStarted","Data":"deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c"} Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.933205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerStarted","Data":"242a8544a68d9d74ed6eb73bcacefdebb2fd4ae624de878f66d716d08691a8be"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.009110 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"] Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.034455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9b27\" (UniqueName: \"kubernetes.io/projected/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-kube-api-access-p9b27\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.034959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-ovndb-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: 
\"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-public-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-combined-ca-bundle\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-internal-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-httpd-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.040024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-httpd-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc 
kubenswrapper[4722]: I0219 19:39:04.042728 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-combined-ca-bundle\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.045283 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-ovndb-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.045871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-public-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.049514 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-internal-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.053450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.071029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9b27\" 
(UniqueName: \"kubernetes.io/projected/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-kube-api-access-p9b27\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.159502 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.194084 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.357136 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:04 crc kubenswrapper[4722]: W0219 19:39:04.420777 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16a3e23c_d8b4_4030_ad8e_f12ffc069564.slice/crio-222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35 WatchSource:0}: Error finding container 222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35: Status 404 returned error can't find the container with id 222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35 Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.624202 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.807798 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.953723 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerID="6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0" exitCode=0 Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.953764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerDied","Data":"6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.956570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerStarted","Data":"222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.959038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" event={"ID":"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5","Type":"ContainerDied","Data":"c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.959075 4722 scope.go:117] "RemoveContainer" containerID="3ee66e2d9f3ff70ba95788fa2041bc3ab471615c0403f4144d9bc1ae897eb89c" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.959077 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.963965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerStarted","Data":"13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.964001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerStarted","Data":"ea7852dcadbb3212d9207882980f204f6c637ee58504de45986bee8494bbea9e"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.967617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerStarted","Data":"9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.967701 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.967748 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.975536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerStarted","Data":"39341497cf9614456c8136bec5d4742d83abccc290b3636722c56ae71d3a4127"} Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.980477 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerStarted","Data":"7df30d9bfcde00a2b1fb449d6fdae155a98e793982f413e0d76e3453b4b0afd2"} Feb 19 
19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.982811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.982873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.983289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.983318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.984243 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.984268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.984257 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-86mtg" podStartSLOduration=1.9842465379999998 podStartE2EDuration="1.984246538s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:04.980666476 +0000 UTC m=+1244.593016800" watchObservedRunningTime="2026-02-19 19:39:04.984246538 +0000 UTC m=+1244.596596852" Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.987207 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx" (OuterVolumeSpecName: "kube-api-access-gxrcx") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "kube-api-access-gxrcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.023572 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59d6bc9fcb-2t849" podStartSLOduration=3.023548681 podStartE2EDuration="3.023548681s" podCreationTimestamp="2026-02-19 19:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:05.004491498 +0000 UTC m=+1244.616841822" watchObservedRunningTime="2026-02-19 19:39:05.023548681 +0000 UTC m=+1244.635899015" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.044383 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8694c7b8f7-2td8g"] Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.051849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config" (OuterVolumeSpecName: "config") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.054548 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.056895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.065529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.068464 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086373 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086401 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086414 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086425 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086435 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086444 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.430748 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.446876 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.859859 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:05.999949 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e73e983-eb03-4734-838f-85a759275b7a" containerID="b58a92dec7aa9fa905d85fdb92866ee986e06e5d51cc5b90911bb9db7cccb1d3" exitCode=0 Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.000048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerDied","Data":"b58a92dec7aa9fa905d85fdb92866ee986e06e5d51cc5b90911bb9db7cccb1d3"} Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.005846 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8694c7b8f7-2td8g" event={"ID":"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b","Type":"ContainerStarted","Data":"a0ab87ecf0804ea35d3ab983301b5f69fe1b71f605261c3ce9bc8489326f6346"} Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.017504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.017890 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerID="e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0" exitCode=0 Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerDied","Data":"e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0"} Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018629 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018703 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018776 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.019607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.019646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " Feb 19 19:39:06 crc 
kubenswrapper[4722]: I0219 19:39:06.026662 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.028722 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.029980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts" (OuterVolumeSpecName: "scripts") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.031295 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerStarted","Data":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"} Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.035611 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs" (OuterVolumeSpecName: "kube-api-access-6xsfs") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "kube-api-access-6xsfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044327 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" exitCode=0 Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044609 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351"} Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"1faa29ce27320ab22dc6db2828db88d540021f7a0832148de51b439f8684b1f0"} Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044682 4722 scope.go:117] "RemoveContainer" containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044860 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.072221 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129241 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129532 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129545 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129558 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129569 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.159941 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.172167 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data" (OuterVolumeSpecName: "config-data") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.232128 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.232185 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.271934 4722 scope.go:117] "RemoveContainer" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.285170 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.331607 4722 scope.go:117] "RemoveContainer" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.332820 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.332871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.332937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333600 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 
19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333795 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.337634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz" (OuterVolumeSpecName: "kube-api-access-c2qkz") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "kube-api-access-c2qkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.346433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.436802 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.436846 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.456405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.463887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.470491 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config" (OuterVolumeSpecName: "config") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.484415 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.505169 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540119 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540221 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540236 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540247 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540260 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.593300 4722 scope.go:117] "RemoveContainer" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.615268 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.626576 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650246 4722 scope.go:117] "RemoveContainer" containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650277 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650670 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650681 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerName="init" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650705 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerName="init" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650724 4722 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650731 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650745 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650750 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650762 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650768 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650780 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650786 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650794 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650800 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650968 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650979 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650988 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.651012 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.651024 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerName="init" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.651035 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.652687 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.653038 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c\": container with ID starting with 81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c not found: ID does not exist" containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.653069 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c"} err="failed to get container status \"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c\": rpc error: code = NotFound desc = could not find container \"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c\": container with ID starting with 81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.653090 4722 scope.go:117] "RemoveContainer" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.654713 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.655978 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.657972 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd\": container with ID starting with 5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd not found: ID does not 
exist" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.658026 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd"} err="failed to get container status \"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd\": rpc error: code = NotFound desc = could not find container \"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd\": container with ID starting with 5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.658058 4722 scope.go:117] "RemoveContainer" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.659589 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351\": container with ID starting with 891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351 not found: ID does not exist" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.659622 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351"} err="failed to get container status \"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351\": rpc error: code = NotFound desc = could not find container \"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351\": container with ID starting with 891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351 not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.659646 4722 scope.go:117] 
"RemoveContainer" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.659954 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14\": container with ID starting with 6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14 not found: ID does not exist" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.659978 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14"} err="failed to get container status \"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14\": rpc error: code = NotFound desc = could not find container \"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14\": container with ID starting with 6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14 not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.663217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.745566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.745724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"ceilometer-0\" (UID: 
\"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.746945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747047 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747144 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747367 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 
19:39:06.761162 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.848936 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.849005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.849034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.851103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.854658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.855847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc 
kubenswrapper[4722]: I0219 19:39:06.863923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.864920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.871459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.057591 4722 generic.go:334] "Generic (PLEG): container finished" podID="1725704f-c153-4de4-9246-87c6a5e878ea" containerID="13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187" exitCode=0 Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.057654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerDied","Data":"13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187"} Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.063236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" event={"ID":"96ffdf9d-f932-419b-be31-9f38358d2db5","Type":"ContainerStarted","Data":"c18c2ab41b489e7badb5ac98b2e3c4606d65918c65f00e12eb38be57c9fae474"} Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.083732 4722 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.088841 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.119250 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" path="/var/lib/kubelet/pods/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5/volumes" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.119945 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" path="/var/lib/kubelet/pods/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16/volumes" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122557 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerStarted","Data":"fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16"} Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122611 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" event={"ID":"66f5042d-2b30-4ac4-8594-cfc0f9590460","Type":"ContainerStarted","Data":"ed273f397b825fdd8bdb6f68de94a3ee0db9b04b68950081332c2fde978e7ba0"} Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8694c7b8f7-2td8g" event={"ID":"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b","Type":"ContainerStarted","Data":"a3bd7013e2ba7173aa28bb7c35ee99099b95a80e1d8988098d53c80782aa5146"} Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122639 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerDied","Data":"245b2a4bf08b03ca07fdc608528d3501f8e470227ac611d75e1e28818470fe64"} Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122927 4722 scope.go:117] "RemoveContainer" containerID="6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.137449 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" podStartSLOduration=4.137431076 podStartE2EDuration="4.137431076s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:07.107944168 +0000 UTC m=+1246.720294492" watchObservedRunningTime="2026-02-19 19:39:07.137431076 +0000 UTC m=+1246.749781400" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.149428 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.174429 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.175386 4722 scope.go:117] "RemoveContainer" containerID="e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0" Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.680244 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.096418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" event={"ID":"96ffdf9d-f932-419b-be31-9f38358d2db5","Type":"ContainerStarted","Data":"562d790cafd82cb261abee24047ccdf78c04de4bc01c721b2bc25d10c0b503fa"} Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.097894 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"2fc5d288c8b590c8621fd130a7dd63655d59f6c92407b8882f0ffae525ddf63d"} Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.100391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" event={"ID":"66f5042d-2b30-4ac4-8594-cfc0f9590460","Type":"ContainerStarted","Data":"7e892a24818bdb1b487baebbcd5ac410109432e5dcc2bf11f41b117cd4d8ca06"} Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.103094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8694c7b8f7-2td8g" event={"ID":"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b","Type":"ContainerStarted","Data":"120a7a51f92ae71b94e908bdc8f0169f02eb2c6093c0a16f931af3f4da30a580"} Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.103303 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106364 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerStarted","Data":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"} Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106495 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" containerID="cri-o://f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" gracePeriod=30 Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106513 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106489 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" containerID="cri-o://ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" gracePeriod=30 Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.116511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerStarted","Data":"9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd"} Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.116575 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerStarted","Data":"923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f"} Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.125787 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" podStartSLOduration=3.75351923 podStartE2EDuration="7.125770079s" podCreationTimestamp="2026-02-19 19:39:01 +0000 UTC" firstStartedPulling="2026-02-19 19:39:03.019216095 +0000 UTC m=+1242.631566419" lastFinishedPulling="2026-02-19 19:39:06.391466944 +0000 UTC m=+1246.003817268" observedRunningTime="2026-02-19 19:39:08.120387172 +0000 UTC m=+1247.732737516" watchObservedRunningTime="2026-02-19 19:39:08.125770079 +0000 UTC m=+1247.738120403" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.152704 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8694c7b8f7-2td8g" podStartSLOduration=5.152679786 podStartE2EDuration="5.152679786s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:08.145570495 +0000 UTC m=+1247.757920829" watchObservedRunningTime="2026-02-19 19:39:08.152679786 +0000 
UTC m=+1247.765030110" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.170886 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.123168857 podStartE2EDuration="5.170868783s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="2026-02-19 19:39:04.224439426 +0000 UTC m=+1243.836789750" lastFinishedPulling="2026-02-19 19:39:06.272139352 +0000 UTC m=+1245.884489676" observedRunningTime="2026-02-19 19:39:08.169011904 +0000 UTC m=+1247.781362238" watchObservedRunningTime="2026-02-19 19:39:08.170868783 +0000 UTC m=+1247.783219127" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.209046 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" podStartSLOduration=3.593083886 podStartE2EDuration="7.209027309s" podCreationTimestamp="2026-02-19 19:39:01 +0000 UTC" firstStartedPulling="2026-02-19 19:39:02.755675724 +0000 UTC m=+1242.368026048" lastFinishedPulling="2026-02-19 19:39:06.371619147 +0000 UTC m=+1245.983969471" observedRunningTime="2026-02-19 19:39:08.198552904 +0000 UTC m=+1247.810903238" watchObservedRunningTime="2026-02-19 19:39:08.209027309 +0000 UTC m=+1247.821377643" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.275258 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.275232029 podStartE2EDuration="5.275232029s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:08.245405331 +0000 UTC m=+1247.857755655" watchObservedRunningTime="2026-02-19 19:39:08.275232029 +0000 UTC m=+1247.887582363" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.510380 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" 
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.749117 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.789920 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-546c4d4684-6vk7j"] Feb 19 19:39:08 crc kubenswrapper[4722]: E0219 19:39:08.790381 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" containerName="cloudkitty-storageinit" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.790394 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" containerName="cloudkitty-storageinit" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.790589 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" containerName="cloudkitty-storageinit" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.791643 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.795259 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.795411 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.832208 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-546c4d4684-6vk7j"] Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.856700 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.904197 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.904270 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.904289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905305 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905690 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905997 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7701b23-dddb-4a45-8982-11ab69bc30b1-logs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-public-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data-custom\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-internal-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ptvm4\" (UniqueName: \"kubernetes.io/projected/a7701b23-dddb-4a45-8982-11ab69bc30b1-kube-api-access-ptvm4\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906700 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-combined-ca-bundle\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.914280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj" (OuterVolumeSpecName: "kube-api-access-zw9tj") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "kube-api-access-zw9tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.918580 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts" (OuterVolumeSpecName: "scripts") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.919222 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs" (OuterVolumeSpecName: "certs") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.948767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data" (OuterVolumeSpecName: "config-data") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.951523 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008567 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008613 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008683 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " Feb 19 
19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008728 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008790 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a7701b23-dddb-4a45-8982-11ab69bc30b1-logs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-public-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009374 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data-custom\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009397 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-internal-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvm4\" (UniqueName: \"kubernetes.io/projected/a7701b23-dddb-4a45-8982-11ab69bc30b1-kube-api-access-ptvm4\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-combined-ca-bundle\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009563 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009581 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009590 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009601 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009609 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009571 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.010208 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs" (OuterVolumeSpecName: "logs") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.015932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7701b23-dddb-4a45-8982-11ab69bc30b1-logs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.019042 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k" (OuterVolumeSpecName: "kube-api-access-7l94k") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "kube-api-access-7l94k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.025913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-public-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.025979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-combined-ca-bundle\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.026007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data-custom\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.026019 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts" (OuterVolumeSpecName: "scripts") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.026619 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-internal-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.027719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.032963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.040100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvm4\" (UniqueName: \"kubernetes.io/projected/a7701b23-dddb-4a45-8982-11ab69bc30b1-kube-api-access-ptvm4\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.050388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.082398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data" (OuterVolumeSpecName: "config-data") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.084133 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" path="/var/lib/kubelet/pods/5c88f138-094d-44c0-b1c9-1492e7e11e9b/volumes" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112002 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112032 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112042 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112052 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112062 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112070 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112080 4722 reconciler_common.go:293] "Volume detached 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.140962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.152485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d"} Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161041 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" exitCode=0 Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161073 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" exitCode=143 Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161197 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerDied","Data":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"} Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerDied","Data":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"} Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerDied","Data":"222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35"} Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161257 4722 scope.go:117] "RemoveContainer" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161395 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.174978 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.176115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerDied","Data":"ea7852dcadbb3212d9207882980f204f6c637ee58504de45986bee8494bbea9e"} Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.176219 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7852dcadbb3212d9207882980f204f6c637ee58504de45986bee8494bbea9e" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.230294 4722 scope.go:117] "RemoveContainer" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.282466 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.312342 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.315323 4722 scope.go:117] "RemoveContainer" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.316759 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": container with ID starting with f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0 not found: ID does not exist" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.316799 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"} err="failed to get container status \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": rpc error: code = NotFound desc = could not find container \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": container with ID starting with f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.316822 4722 scope.go:117] "RemoveContainer" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.317928 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": container with ID starting with ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967 not found: ID does not exist" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.317971 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"} err="failed to get container status \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": rpc error: code = NotFound desc = could not find container 
\"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": container with ID starting with ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.317997 4722 scope.go:117] "RemoveContainer" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.318552 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"} err="failed to get container status \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": rpc error: code = NotFound desc = could not find container \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": container with ID starting with f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.318580 4722 scope.go:117] "RemoveContainer" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.318938 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"} err="failed to get container status \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": rpc error: code = NotFound desc = could not find container \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": container with ID starting with ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.355811 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.356222 4722 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356236 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.356256 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356263 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356439 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356468 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.357749 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.368402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.368664 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.368864 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.385112 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.421481 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.422666 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.440556 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.451751 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452107 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bnkq4" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452311 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452485 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452642 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cloudkitty-proc-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.461389 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.461623 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" containerID="cri-o://fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16" gracePeriod=10 Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") pod \"cloudkitty-proc-0\" 
(UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-scripts\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523964 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524018 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8e6512-8007-4e99-8589-8dccb1975e3f-logs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524043 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524091 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4frh\" (UniqueName: \"kubernetes.io/projected/8c8e6512-8007-4e99-8589-8dccb1975e3f-kube-api-access-p4frh\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8e6512-8007-4e99-8589-8dccb1975e3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524219 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.606136 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.607964 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.637868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-scripts\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638176 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"cloudkitty-proc-0\" (UID: 
\"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638220 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8e6512-8007-4e99-8589-8dccb1975e3f-logs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638290 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638307 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " 
pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638347 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638364 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4frh\" (UniqueName: \"kubernetes.io/projected/8c8e6512-8007-4e99-8589-8dccb1975e3f-kube-api-access-p4frh\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8e6512-8007-4e99-8589-8dccb1975e3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638447 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") 
pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638609 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.639821 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8e6512-8007-4e99-8589-8dccb1975e3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.640513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8e6512-8007-4e99-8589-8dccb1975e3f-logs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.654800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.656787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.661813 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.665454 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.669703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.671919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.677106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.677564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.682898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p4frh\" (UniqueName: \"kubernetes.io/projected/8c8e6512-8007-4e99-8589-8dccb1975e3f-kube-api-access-p4frh\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.696861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.710772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.735837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: 
\"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740548 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.741613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " 
pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.742123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.742678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.754379 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.754782 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.754934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-scripts\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.780780 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.798470 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.799036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.800040 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.832193 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.835884 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.837648 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.889017 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946297 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946570 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: 
I0219 19:39:09.946898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.984048 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e73e983_eb03_4734_838f_85a759275b7a.slice/crio-fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.988298 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051236 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051471 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051605 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.052296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.061013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.061432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.062872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc 
kubenswrapper[4722]: I0219 19:39:10.079748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.079954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.088705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.256928 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-546c4d4684-6vk7j"] Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.279365 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e73e983-eb03-4734-838f-85a759275b7a" containerID="fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16" exitCode=0 Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.279399 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerDied","Data":"fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16"} Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.285134 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5"} Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.327204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.693384 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.784949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.785603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.785811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.785935 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 
19:39:10.786037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.786822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.800494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz" (OuterVolumeSpecName: "kube-api-access-4zffz") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "kube-api-access-4zffz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.890663 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.902724 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:10 crc kubenswrapper[4722]: W0219 19:39:10.921641 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8676c8db_d85f_44d2_ae94_560542a5cbf3.slice/crio-cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812 WatchSource:0}: Error finding container cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812: Status 404 returned error can't find the container with id cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812 Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.937657 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.961598 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.968640 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.992308 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.011684 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.016181 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.021886 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config" (OuterVolumeSpecName: "config") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.023616 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093661 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093685 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093700 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093708 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.116522 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" path="/var/lib/kubelet/pods/16a3e23c-d8b4-4030-ad8e-f12ffc069564/volumes" Feb 19 19:39:11 crc kubenswrapper[4722]: W0219 19:39:11.182292 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d55206_1b8d_4013_a42b_d7e634815929.slice/crio-0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2 WatchSource:0}: Error finding container 0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2: Status 404 returned error can't find the container with id 0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2 Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.230945 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.307011 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerDied","Data":"39341497cf9614456c8136bec5d4742d83abccc290b3636722c56ae71d3a4127"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.307059 4722 scope.go:117] "RemoveContainer" containerID="fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.307215 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.315882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerStarted","Data":"0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.319711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c8e6512-8007-4e99-8589-8dccb1975e3f","Type":"ContainerStarted","Data":"3379a200763a43f81a93e7484422905bc009b41cd937c2876aa4df4396fe51aa"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.351990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.360408 4722 generic.go:334] "Generic (PLEG): container finished" podID="8f530e65-8397-49d6-929a-201bb5dfe585" containerID="380c536ebfd3cf4e5ded9eb26bb64cd838a985f8d5ba0c199a97d05a07b511f3" exitCode=0 Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.360489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerDied","Data":"380c536ebfd3cf4e5ded9eb26bb64cd838a985f8d5ba0c199a97d05a07b511f3"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.360514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerStarted","Data":"6a529cc3a96af23463f3dfa462bf02cb46f29fb8e36534fccb322ef7ab7a6728"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.381718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-546c4d4684-6vk7j" event={"ID":"a7701b23-dddb-4a45-8982-11ab69bc30b1","Type":"ContainerStarted","Data":"61aaad90ae962a83d80291b7e325d625dae8793c7d5caf63aeaf7adc9417ebd3"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.381776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546c4d4684-6vk7j" event={"ID":"a7701b23-dddb-4a45-8982-11ab69bc30b1","Type":"ContainerStarted","Data":"b5f7b05354b2ad7eb50e5be20f018c5ba3940cc461bcda1e9b151c7a789fd61d"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.381790 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546c4d4684-6vk7j" event={"ID":"a7701b23-dddb-4a45-8982-11ab69bc30b1","Type":"ContainerStarted","Data":"50659fb6df6ab070768ddc57a8fa622cf5a2601aadff9c10ef50cb62fbc11144"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.382757 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.382789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.411279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerStarted","Data":"cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.459997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-546c4d4684-6vk7j" podStartSLOduration=3.459981016 podStartE2EDuration="3.459981016s" podCreationTimestamp="2026-02-19 19:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:11.431431567 +0000 UTC m=+1251.043781901" 
watchObservedRunningTime="2026-02-19 19:39:11.459981016 +0000 UTC m=+1251.072331340" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.596307 4722 scope.go:117] "RemoveContainer" containerID="b58a92dec7aa9fa905d85fdb92866ee986e06e5d51cc5b90911bb9db7cccb1d3" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.629559 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.650950 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.803700 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.803748 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.803785 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.804556 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:39:11 crc 
kubenswrapper[4722]: I0219 19:39:11.804603 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5" gracePeriod=600 Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.389237 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.459400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c8e6512-8007-4e99-8589-8dccb1975e3f","Type":"ContainerStarted","Data":"231b9df70e244176ab3c47cc1a307eb91b71a9d5a2d108d53dcbbf64a3791510"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471246 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5" exitCode=0 Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471340 4722 scope.go:117] "RemoveContainer" containerID="d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.476412 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerStarted","Data":"b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.476514 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.525619 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerStarted","Data":"ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.525654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerStarted","Data":"72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.525668 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.545987 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" podStartSLOduration=3.5459657570000003 podStartE2EDuration="3.545965757s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:12.540107925 +0000 UTC m=+1252.152458259" watchObservedRunningTime="2026-02-19 19:39:12.545965757 +0000 UTC m=+1252.158316081" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.571259 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.571238553 
podStartE2EDuration="3.571238553s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:12.564449332 +0000 UTC m=+1252.176799656" watchObservedRunningTime="2026-02-19 19:39:12.571238553 +0000 UTC m=+1252.183588877" Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.087089 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e73e983-eb03-4734-838f-85a759275b7a" path="/var/lib/kubelet/pods/2e73e983-eb03-4734-838f-85a759275b7a/volumes" Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.545033 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" containerID="cri-o://72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951" gracePeriod=30 Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.545392 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" containerID="cri-o://ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a" gracePeriod=30 Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.796483 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.834723 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.501687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561853 4722 generic.go:334] "Generic (PLEG): container finished" podID="e7d55206-1b8d-4013-a42b-d7e634815929" 
containerID="ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a" exitCode=0 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561880 4722 generic.go:334] "Generic (PLEG): container finished" podID="e7d55206-1b8d-4013-a42b-d7e634815929" containerID="72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951" exitCode=143 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerDied","Data":"ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a"} Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerDied","Data":"72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951"} Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.563789 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c8e6512-8007-4e99-8589-8dccb1975e3f","Type":"ContainerStarted","Data":"0c9fd2d9bee615421145ea3e792710d0810ce10840062bfbc82e03a581b134a9"} Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.563927 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" containerID="cri-o://923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f" gracePeriod=30 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.563978 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="probe" containerID="cri-o://9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd" gracePeriod=30 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 
19:39:14.599905 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.599887296 podStartE2EDuration="5.599887296s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:14.585311712 +0000 UTC m=+1254.197662036" watchObservedRunningTime="2026-02-19 19:39:14.599887296 +0000 UTC m=+1254.212237620" Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.841261 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.923678 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.029093 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121223 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121249 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod 
\"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121410 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.127619 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs" (OuterVolumeSpecName: "logs") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.155373 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.156944 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb" (OuterVolumeSpecName: "kube-api-access-jtbhb") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "kube-api-access-jtbhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.164273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts" (OuterVolumeSpecName: "scripts") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.165280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs" (OuterVolumeSpecName: "certs") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.199403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.209699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data" (OuterVolumeSpecName: "config-data") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224074 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224115 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224125 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224134 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224145 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224212 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224220 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.573054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerStarted","Data":"d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.574999 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerID="9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd" exitCode=0 Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.575018 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerID="923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f" exitCode=0 Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.575060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerDied","Data":"9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.575078 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerDied","Data":"923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.576735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerDied","Data":"0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.576793 4722 scope.go:117] "RemoveContainer" containerID="ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.576746 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.589392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.599525 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.874803734 podStartE2EDuration="6.5994976s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="2026-02-19 19:39:10.927807627 +0000 UTC m=+1250.540157951" lastFinishedPulling="2026-02-19 19:39:14.652501493 +0000 UTC m=+1254.264851817" observedRunningTime="2026-02-19 19:39:15.595655191 +0000 UTC m=+1255.208005515" watchObservedRunningTime="2026-02-19 19:39:15.5994976 +0000 UTC m=+1255.211847924" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.647228 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.708325 4722 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.747559 4722 scope.go:117] "RemoveContainer" containerID="72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.769670 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796227 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796729 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796758 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="init" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796765 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="init" Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796781 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796788 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796795 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796801 
4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.797034 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.797056 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.797066 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.798181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.812078 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.862505704 podStartE2EDuration="9.812056854s" podCreationTimestamp="2026-02-19 19:39:06 +0000 UTC" firstStartedPulling="2026-02-19 19:39:07.700376993 +0000 UTC m=+1247.312727317" lastFinishedPulling="2026-02-19 19:39:14.649928143 +0000 UTC m=+1254.262278467" observedRunningTime="2026-02-19 19:39:15.708606815 +0000 UTC m=+1255.320957139" watchObservedRunningTime="2026-02-19 19:39:15.812056854 +0000 UTC m=+1255.424407178" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.816115 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.816267 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.816372 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-cloudkitty-internal-svc" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.852688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.944940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945088 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046723 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"cloudkitty-api-0\" (UID: 
\"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046820 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046869 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046942 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046999 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.047070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.047127 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.047434 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.054762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.054868 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " 
pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.062025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.062313 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.062611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.064430 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.065919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.066458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br27h\" (UniqueName: 
\"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.162138 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.170279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.251980 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252120 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: 
I0219 19:39:16.252461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.256142 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.260424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.261305 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts" (OuterVolumeSpecName: "scripts") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.266910 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd" (OuterVolumeSpecName: "kube-api-access-xq4kd") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "kube-api-access-xq4kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.347924 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355432 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355469 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355481 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355493 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355502 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.402636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data" (OuterVolumeSpecName: "config-data") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.457820 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.600087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerDied","Data":"7df30d9bfcde00a2b1fb449d6fdae155a98e793982f413e0d76e3453b4b0afd2"} Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.600100 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.600462 4722 scope.go:117] "RemoveContainer" containerID="9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.601698 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.635265 4722 scope.go:117] "RemoveContainer" containerID="923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.636129 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.652297 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.664382 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: E0219 19:39:16.664910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.664933 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" Feb 19 19:39:16 crc kubenswrapper[4722]: E0219 19:39:16.664963 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="probe" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.664972 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="probe" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.665235 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" 
containerName="probe" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.665257 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.666587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.670590 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.678216 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.756215 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: W0219 19:39:16.761354 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57386acb_6299_4fd3_80a2_25d8769dcc93.slice/crio-42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802 WatchSource:0}: Error finding container 42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802: Status 404 returned error can't find the container with id 42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802 Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.765576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rszkk\" (UniqueName: \"kubernetes.io/projected/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-kube-api-access-rszkk\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.765923 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.765988 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.767046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.767083 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.767103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rszkk\" (UniqueName: \"kubernetes.io/projected/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-kube-api-access-rszkk\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.869029 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.869205 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.874339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.875245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.875367 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.889666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.897729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rszkk\" (UniqueName: \"kubernetes.io/projected/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-kube-api-access-rszkk\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.053008 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.117192 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" path="/var/lib/kubelet/pods/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d/volumes" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.118044 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" path="/var/lib/kubelet/pods/e7d55206-1b8d-4013-a42b-d7e634815929/volumes" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628326 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerStarted","Data":"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"} Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerStarted","Data":"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"} Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerStarted","Data":"42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802"} Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628857 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 
19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628963 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc" containerID="cri-o://d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471" gracePeriod=30 Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.660475 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.660460258 podStartE2EDuration="2.660460258s" podCreationTimestamp="2026-02-19 19:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:17.657475256 +0000 UTC m=+1257.269825580" watchObservedRunningTime="2026-02-19 19:39:17.660460258 +0000 UTC m=+1257.272810582" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.798507 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:18 crc kubenswrapper[4722]: I0219 19:39:18.646785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"afcc30d0-b94c-4bf7-8736-fb35bc461fa2","Type":"ContainerStarted","Data":"d1764b20f34ed6d85d0d47b8b2899873a0c19b1fb3f82cf9b3b2d74b7a687bc6"} Feb 19 19:39:18 crc kubenswrapper[4722]: I0219 19:39:18.647274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"afcc30d0-b94c-4bf7-8736-fb35bc461fa2","Type":"ContainerStarted","Data":"caafa2143ff64269289e010c971ce0519a5814d460892533e177e0083afeaebe"} Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.668587 4722 generic.go:334] "Generic (PLEG): container finished" podID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerID="d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471" exitCode=0 Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.668956 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerDied","Data":"d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471"} Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.684326 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"afcc30d0-b94c-4bf7-8736-fb35bc461fa2","Type":"ContainerStarted","Data":"b3b3e2358eab6f69738470fbb9c2f2d49b4d091b93d31cbbe4fde26000e442b3"} Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.727930 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.727905399 podStartE2EDuration="3.727905399s" podCreationTimestamp="2026-02-19 19:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:19.719228689 +0000 UTC m=+1259.331579053" watchObservedRunningTime="2026-02-19 19:39:19.727905399 +0000 UTC m=+1259.340255723" Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.975655 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.990311 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.076361 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.076673 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" containerID="cri-o://0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06" gracePeriod=10 Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101295 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101344 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101555 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101761 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.119334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng" (OuterVolumeSpecName: "kube-api-access-p26ng") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "kube-api-access-p26ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.124691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs" (OuterVolumeSpecName: "certs") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.125325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.134315 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts" (OuterVolumeSpecName: "scripts") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.197255 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.203970 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204000 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204009 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204018 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204027 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.216815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data" (OuterVolumeSpecName: "config-data") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.308135 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.563724 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf618be57_2b9f_4455_8de0_90379bc9d57b.slice/crio-0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.696469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerDied","Data":"cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812"} Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.696558 4722 scope.go:117] "RemoveContainer" containerID="d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.697787 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701136 4722 generic.go:334] "Generic (PLEG): container finished" podID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerID="0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06" exitCode=0 Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerDied","Data":"0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06"} Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerDied","Data":"b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea"} Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701260 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.760694 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.779938 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.815779 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.826901 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.827421 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc" Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.827467 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827476 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.827494 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="init" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827501 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="init" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827788 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827812 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.828652 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.830784 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839499 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839525 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.847463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42" (OuterVolumeSpecName: "kube-api-access-jsg42") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "kube-api-access-jsg42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.860891 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.904238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.919047 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.923463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.934476 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.941404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.941646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.941856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942421 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942495 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942558 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942614 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942669 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.945362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config" (OuterVolumeSpecName: "config") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045238 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045942 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.048802 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.049385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.049741 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.054186 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.059613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.060587 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.065625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.065828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.093648 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" path="/var/lib/kubelet/pods/8676c8db-d85f-44d2-ae94-560542a5cbf3/volumes" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.172764 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.311944 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.389798 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.390012 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log" containerID="cri-o://deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c" gracePeriod=30 Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.390442 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api" containerID="cri-o://9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26" gracePeriod=30 Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.681278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:21 crc kubenswrapper[4722]: W0219 19:39:21.683667 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00bbae7e_ebc6_4102_9398_fc131546bbf5.slice/crio-fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb WatchSource:0}: Error finding container fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb: Status 404 returned error can't find the container with id fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.720414 4722 generic.go:334] "Generic (PLEG): container finished" podID="da70c61d-7b82-48ee-bce0-53e96df3442d" 
containerID="deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c" exitCode=143 Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.720499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerDied","Data":"deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c"} Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.725797 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.726985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerStarted","Data":"fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb"} Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.790217 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.802951 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.054550 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.595405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.751334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerStarted","Data":"20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e"} Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.785673 4722 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.7856581030000003 podStartE2EDuration="2.785658103s" podCreationTimestamp="2026-02-19 19:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:22.773808075 +0000 UTC m=+1262.386158399" watchObservedRunningTime="2026-02-19 19:39:22.785658103 +0000 UTC m=+1262.398008427" Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.083744 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" path="/var/lib/kubelet/pods/f618be57-2b9f-4455-8de0-90379bc9d57b/volumes" Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.340420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.462036 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.481547 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.857540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.942126 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.388377 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.599885 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log" 
probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:48414->10.217.0.178:9311: read: connection reset by peer" Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.599899 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:48400->10.217.0.178:9311: read: connection reset by peer" Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774073 4722 generic.go:334] "Generic (PLEG): container finished" podID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerID="9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26" exitCode=0 Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774311 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7cc6894556-2r5j6" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log" containerID="cri-o://c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" gracePeriod=30 Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerDied","Data":"9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26"} Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774880 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7cc6894556-2r5j6" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api" containerID="cri-o://95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" gracePeriod=30 Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.369385 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461885 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.462081 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs" (OuterVolumeSpecName: "logs") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.463367 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.469289 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.487400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v" (OuterVolumeSpecName: "kube-api-access-tlq7v") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "kube-api-access-tlq7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.525392 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data" (OuterVolumeSpecName: "config-data") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.528848 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565190 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565229 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565244 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565257 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.808292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerDied","Data":"242a8544a68d9d74ed6eb73bcacefdebb2fd4ae624de878f66d716d08691a8be"} Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.808439 4722 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.808562 4722 scope.go:117] "RemoveContainer" containerID="9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.814470 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" exitCode=143 Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.814502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerDied","Data":"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df"} Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.847395 4722 scope.go:117] "RemoveContainer" containerID="deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c" Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.854025 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.860979 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:27 crc kubenswrapper[4722]: I0219 19:39:27.086349 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" path="/var/lib/kubelet/pods/da70c61d-7b82-48ee-bce0-53e96df3442d/volumes" Feb 19 19:39:27 crc kubenswrapper[4722]: I0219 19:39:27.271494 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.466134 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527493 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527556 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527618 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs" (OuterVolumeSpecName: "logs") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.528075 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.551145 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr" (OuterVolumeSpecName: "kube-api-access-x57pr") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "kube-api-access-x57pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.552118 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts" (OuterVolumeSpecName: "scripts") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.584625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data" (OuterVolumeSpecName: "config-data") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.598259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630145 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630203 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630213 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630225 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") on node 
\"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.653350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.663318 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664066 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664212 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log" Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664311 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664398 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log" Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664491 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664569 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api" Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664688 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" 
containerName="barbican-api" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664776 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665168 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665267 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665356 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665442 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.666481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.668301 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.669125 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-czh7m" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.669396 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.669431 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.675807 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.732343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.732693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.732879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.733183 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.733373 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.733389 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.834523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.834589 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.834692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: 
I0219 19:39:28.834740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.835649 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.838645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.842627 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848292 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" exitCode=0 Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerDied","Data":"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d"} Feb 19 
19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerDied","Data":"c688d661fba42ba2a53e010b04af9f22dbacb7137f02c088f90b0645fc7ab228"} Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848367 4722 scope.go:117] "RemoveContainer" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848457 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.856122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.895993 4722 scope.go:117] "RemoveContainer" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.896491 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.915816 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.924643 4722 scope.go:117] "RemoveContainer" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.925318 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d\": container with ID 
starting with 95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d not found: ID does not exist" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.925517 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d"} err="failed to get container status \"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d\": rpc error: code = NotFound desc = could not find container \"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d\": container with ID starting with 95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d not found: ID does not exist" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.925676 4722 scope.go:117] "RemoveContainer" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.926231 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df\": container with ID starting with c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df not found: ID does not exist" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.926411 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df"} err="failed to get container status \"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df\": rpc error: code = NotFound desc = could not find container \"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df\": container with ID starting with c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df not found: 
ID does not exist" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.948463 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.949879 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.959611 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.013874 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.017638 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.062967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.085872 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" path="/var/lib/kubelet/pods/e0e1ecfc-6394-4815-bf10-7623a5359525/volumes" Feb 19 19:39:29 crc kubenswrapper[4722]: E0219 19:39:29.138244 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 19 19:39:29 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_54745880-0d6d-432b-be90-a609a4f4bff6_0(6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e" Netns:"/var/run/netns/f605bd00-826b-4a6a-99e5-c59bf30ac4e0" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e;K8S_POD_UID=54745880-0d6d-432b-be90-a609a4f4bff6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/54745880-0d6d-432b-be90-a609a4f4bff6]: expected pod UID "54745880-0d6d-432b-be90-a609a4f4bff6" but got "af557f35-ca9e-4990-bdcb-9e44366dab68" from Kube API Feb 19 19:39:29 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:39:29 crc kubenswrapper[4722]: > Feb 19 19:39:29 crc kubenswrapper[4722]: E0219 19:39:29.138315 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 19 19:39:29 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_54745880-0d6d-432b-be90-a609a4f4bff6_0(6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e" Netns:"/var/run/netns/f605bd00-826b-4a6a-99e5-c59bf30ac4e0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e;K8S_POD_UID=54745880-0d6d-432b-be90-a609a4f4bff6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: 
[openstack/openstackclient/54745880-0d6d-432b-be90-a609a4f4bff6]: expected pod UID "54745880-0d6d-432b-be90-a609a4f4bff6" but got "af557f35-ca9e-4990-bdcb-9e44366dab68" from Kube API Feb 19 19:39:29 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:39:29 crc kubenswrapper[4722]: > pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmg9j\" (UniqueName: \"kubernetes.io/projected/af557f35-ca9e-4990-bdcb-9e44366dab68-kube-api-access-lmg9j\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146259 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146512 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config-secret\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146669 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config-secret\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248743 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmg9j\" (UniqueName: \"kubernetes.io/projected/af557f35-ca9e-4990-bdcb-9e44366dab68-kube-api-access-lmg9j\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.249585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.252986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.253542 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config-secret\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.266033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmg9j\" (UniqueName: \"kubernetes.io/projected/af557f35-ca9e-4990-bdcb-9e44366dab68-kube-api-access-lmg9j\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.507801 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.858724 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.870610 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.877814 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54745880-0d6d-432b-be90-a609a4f4bff6" podUID="af557f35-ca9e-4990-bdcb-9e44366dab68" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962057 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962253 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962462 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.963132 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.968731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj" (OuterVolumeSpecName: "kube-api-access-gwssj") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "kube-api-access-gwssj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.969412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.971288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: W0219 19:39:29.989349 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf557f35_ca9e_4990_bdcb_9e44366dab68.slice/crio-ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7 WatchSource:0}: Error finding container ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7: Status 404 returned error can't find the container with id ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7 Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.993641 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065561 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065600 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065613 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065625 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.869768 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.871333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"af557f35-ca9e-4990-bdcb-9e44366dab68","Type":"ContainerStarted","Data":"ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7"} Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.884800 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54745880-0d6d-432b-be90-a609a4f4bff6" podUID="af557f35-ca9e-4990-bdcb-9e44366dab68" Feb 19 19:39:31 crc kubenswrapper[4722]: I0219 19:39:31.085336 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54745880-0d6d-432b-be90-a609a4f4bff6" path="/var/lib/kubelet/pods/54745880-0d6d-432b-be90-a609a4f4bff6/volumes" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325085 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325490 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" containerID="cri-o://281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325651 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" containerID="cri-o://e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325726 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" 
containerID="cri-o://870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325393 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" containerID="cri-o://51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.335274 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.184:3000/\": EOF" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.630569 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-b7b95d7bc-zqb9x"] Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.633376 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.638046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.638059 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.638972 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.656545 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b7b95d7bc-zqb9x"] Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.722778 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsn6q\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-kube-api-access-tsn6q\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.722846 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-etc-swift\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.722878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-run-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 
19:39:32.722974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-log-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-combined-ca-bundle\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-public-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-internal-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723207 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-config-data\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc 
kubenswrapper[4722]: I0219 19:39:32.825271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-config-data\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsn6q\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-kube-api-access-tsn6q\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-etc-swift\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-run-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-log-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825561 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-combined-ca-bundle\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825607 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-public-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-internal-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.826408 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-run-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.827476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-log-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.832726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-internal-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.832768 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-public-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.834330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-config-data\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.834326 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-combined-ca-bundle\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.835181 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-etc-swift\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.843274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsn6q\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-kube-api-access-tsn6q\") pod 
\"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890416 4722 generic.go:334] "Generic (PLEG): container finished" podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" exitCode=0 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890447 4722 generic.go:334] "Generic (PLEG): container finished" podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" exitCode=2 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890456 4722 generic.go:334] "Generic (PLEG): container finished" podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" exitCode=0 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df"} Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b"} Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d"} Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.959733 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:33 crc kubenswrapper[4722]: I0219 19:39:33.548659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b7b95d7bc-zqb9x"] Feb 19 19:39:33 crc kubenswrapper[4722]: W0219 19:39:33.567243 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42a3f824_28fe_4734_8ada_a74ffb9930a8.slice/crio-9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d WatchSource:0}: Error finding container 9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d: Status 404 returned error can't find the container with id 9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d Feb 19 19:39:33 crc kubenswrapper[4722]: I0219 19:39:33.908755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" event={"ID":"42a3f824-28fe-4734-8ada-a74ffb9930a8","Type":"ContainerStarted","Data":"9bb6330deff38fd0d18ecd03ae5cb24c9789fc21f1750510e2039924d227c72d"} Feb 19 19:39:33 crc kubenswrapper[4722]: I0219 19:39:33.909171 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" event={"ID":"42a3f824-28fe-4734-8ada-a74ffb9930a8","Type":"ContainerStarted","Data":"9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.177601 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.255351 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.255798 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7445db86-7r6w9" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" 
containerID="cri-o://6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da" gracePeriod=30 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.256197 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7445db86-7r6w9" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" containerID="cri-o://df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887" gracePeriod=30 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.906213 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.934132 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" event={"ID":"42a3f824-28fe-4734-8ada-a74ffb9930a8","Type":"ContainerStarted","Data":"79793a96c41434eb0bb076408cbe18073688933274ac3aeaeacb989b708b5d57"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.935040 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.935068 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.937512 4722 generic.go:334] "Generic (PLEG): container finished" podID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerID="df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887" exitCode=0 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.937556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerDied","Data":"df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961397 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" exitCode=0 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"2fc5d288c8b590c8621fd130a7dd63655d59f6c92407b8882f0ffae525ddf63d"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961495 4722 scope.go:117] "RemoveContainer" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961641 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.980769 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" podStartSLOduration=2.980737873 podStartE2EDuration="2.980737873s" podCreationTimestamp="2026-02-19 19:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:34.978325528 +0000 UTC m=+1274.590675872" watchObservedRunningTime="2026-02-19 19:39:34.980737873 +0000 UTC m=+1274.593088197" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: 
\"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994741 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.995689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.997049 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.001610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts" (OuterVolumeSpecName: "scripts") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.008382 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb" (OuterVolumeSpecName: "kube-api-access-72xgb") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "kube-api-access-72xgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.036023 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097138 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097190 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097200 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097209 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097220 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.126781 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data" (OuterVolumeSpecName: "config-data") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.140691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.198572 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.198611 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.321517 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.335176 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349297 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349759 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349773 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349803 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349808 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349824 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349831 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349840 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349846 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350017 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350032 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350055 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350064 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.351916 4722 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.357214 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.358460 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.361472 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod 
\"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.606974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607046 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607137 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607170 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " 
pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.608482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.613329 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.614005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.616841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.617187 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.647741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcll5\" (UniqueName: 
\"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.679791 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:36 crc kubenswrapper[4722]: I0219 19:39:36.197121 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d74fd689-q5qhb" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9696/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:39:37 crc kubenswrapper[4722]: I0219 19:39:37.088878 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41000a66-e725-4b1e-ab9c-31251213e311" path="/var/lib/kubelet/pods/41000a66-e725-4b1e-ab9c-31251213e311/volumes" Feb 19 19:39:38 crc kubenswrapper[4722]: I0219 19:39:38.374569 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:38 crc kubenswrapper[4722]: I0219 19:39:38.375196 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" containerID="cri-o://0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a" gracePeriod=30 Feb 19 19:39:38 crc kubenswrapper[4722]: I0219 19:39:38.375363 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" containerID="cri-o://3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae" gracePeriod=30 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.011015 4722 generic.go:334] "Generic (PLEG): 
container finished" podID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerID="6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da" exitCode=0 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.011086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerDied","Data":"6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da"} Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.013682 4722 generic.go:334] "Generic (PLEG): container finished" podID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerID="0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a" exitCode=143 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.013725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerDied","Data":"0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a"} Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.135454 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.135685 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" containerID="cri-o://b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" gracePeriod=30 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.135813 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" containerID="cri-o://1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" gracePeriod=30 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.997190 4722 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.028674 4722 generic.go:334] "Generic (PLEG): container finished" podID="58e51a47-7d37-46de-96cc-609365fab496" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" exitCode=143 Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.028715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerDied","Data":"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd"} Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.642651 4722 scope.go:117] "RemoveContainer" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.791693 4722 scope.go:117] "RemoveContainer" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.817675 4722 scope.go:117] "RemoveContainer" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.937503 4722 scope.go:117] "RemoveContainer" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.940977 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df\": container with ID starting with e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df not found: ID does not exist" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941057 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df"} err="failed to get container status \"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df\": rpc error: code = NotFound desc = could not find container \"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df\": container with ID starting with e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df not found: ID does not exist" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941121 4722 scope.go:117] "RemoveContainer" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.941519 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b\": container with ID starting with 281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b not found: ID does not exist" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941544 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b"} err="failed to get container status \"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b\": rpc error: code = NotFound desc = could not find container \"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b\": container with ID starting with 281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b not found: ID does not exist" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941561 4722 scope.go:117] "RemoveContainer" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.942045 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5\": container with ID starting with 870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5 not found: ID does not exist" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.942067 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5"} err="failed to get container status \"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5\": rpc error: code = NotFound desc = could not find container \"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5\": container with ID starting with 870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5 not found: ID does not exist" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.942082 4722 scope.go:117] "RemoveContainer" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.944230 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d\": container with ID starting with 51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d not found: ID does not exist" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.944274 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d"} err="failed to get container status \"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d\": rpc error: code = NotFound desc = could not find container 
\"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d\": container with ID starting with 51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d not found: ID does not exist" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.044881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerDied","Data":"0d269d0087152d6edd92c6c1c2324f5e6566d6cbbbcd03d88628b974769fb6f5"} Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.044917 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d269d0087152d6edd92c6c1c2324f5e6566d6cbbbcd03d88628b974769fb6f5" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.046063 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.054141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"af557f35-ca9e-4990-bdcb-9e44366dab68","Type":"ContainerStarted","Data":"b233726d8e4e40dc0e1cb1f2de9faba0ca9d794275a44bf5c5a4deb84d8f9b15"} Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.119570 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.432882422 podStartE2EDuration="13.119554797s" podCreationTimestamp="2026-02-19 19:39:28 +0000 UTC" firstStartedPulling="2026-02-19 19:39:29.991377654 +0000 UTC m=+1269.603727978" lastFinishedPulling="2026-02-19 19:39:40.678050039 +0000 UTC m=+1280.290400353" observedRunningTime="2026-02-19 19:39:41.096425808 +0000 UTC m=+1280.708776182" watchObservedRunningTime="2026-02-19 19:39:41.119554797 +0000 UTC m=+1280.731905121" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121634 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121862 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.128854 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r" (OuterVolumeSpecName: "kube-api-access-dbx7r") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "kube-api-access-dbx7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.130884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.185316 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.211779 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.220794 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config" (OuterVolumeSpecName: "config") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224734 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: W0219 19:39:41.224750 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3534949_6af7_4bf0_ba36_ed96804ada1b.slice/crio-ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab WatchSource:0}: Error finding container ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab: Status 404 returned error can't find the container with id ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224759 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224793 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224808 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.229645 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.326562 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.062183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab"} Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066180 4722 generic.go:334] "Generic (PLEG): container finished" podID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerID="3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae" exitCode=0 Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerDied","Data":"3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae"} Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerDied","Data":"8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d"} Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066552 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066437 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.068890 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.137637 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144196 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144508 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 
crc kubenswrapper[4722]: I0219 19:39:42.144878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144990 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.145017 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.145772 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.145899 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs" (OuterVolumeSpecName: "logs") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.148672 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.148694 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.153619 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.166409 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts" (OuterVolumeSpecName: "scripts") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.167265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t" (OuterVolumeSpecName: "kube-api-access-h6q6t") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "kube-api-access-h6q6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.189082 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (OuterVolumeSpecName: "glance") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). 
InnerVolumeSpecName "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.217891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.219270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data" (OuterVolumeSpecName: "config-data") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.247665 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250388 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250422 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250434 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250469 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250484 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250496 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.295616 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.295778 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d") on node "crc" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.306134 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": read tcp 10.217.0.2:41080->10.217.0.166:9292: read: connection reset by peer" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.306560 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": read tcp 10.217.0.2:41094->10.217.0.166:9292: read: connection reset by peer" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.352029 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.923690 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063419 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063695 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063876 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063920 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063935 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.064001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.064142 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.065757 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.075690 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs" (OuterVolumeSpecName: "logs") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.094002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts" (OuterVolumeSpecName: "scripts") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.110582 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np" (OuterVolumeSpecName: "kube-api-access-g46np") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "kube-api-access-g46np". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.140981 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" path="/var/lib/kubelet/pods/cff58b5f-4c6b-44be-b668-15b2948e6af0/volumes" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.156893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.166333 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167015 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167109 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167183 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167246 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.179690 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.181407 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184589 4722 generic.go:334] "Generic (PLEG): container finished" podID="58e51a47-7d37-46de-96cc-609365fab496" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" exitCode=0 Feb 19 19:39:43 
crc kubenswrapper[4722]: I0219 19:39:43.184671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerDied","Data":"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7"} Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184698 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerDied","Data":"9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac"} Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184716 4722 scope.go:117] "RemoveContainer" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184848 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.194455 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.197828 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8"} Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.226135 4722 scope.go:117] "RemoveContainer" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.288340 4722 scope.go:117] "RemoveContainer" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.288720 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7\": container with ID starting with 1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7 not found: ID does not exist" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.288829 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7"} err="failed to get container status \"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7\": rpc error: code = NotFound desc = could not find container \"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7\": container with ID starting with 1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7 not found: ID does not exist" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.288906 4722 scope.go:117] "RemoveContainer" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" Feb 19 19:39:43 crc 
kubenswrapper[4722]: E0219 19:39:43.289259 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd\": container with ID starting with b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd not found: ID does not exist" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.289303 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd"} err="failed to get container status \"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd\": rpc error: code = NotFound desc = could not find container \"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd\": container with ID starting with b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd not found: ID does not exist" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.300666 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.341620 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355057 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355511 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355534 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355547 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355553 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355565 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355572 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355603 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355609 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355622 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355631 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355637 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355799 4722 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355812 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355823 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355832 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355853 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355865 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.356961 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.359579 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.359815 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.363752 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (OuterVolumeSpecName: "glance") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). 
InnerVolumeSpecName "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.376706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.378216 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.434724 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.462546 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data" (OuterVolumeSpecName: "config-data") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.473190 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.473339 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5") on node "crc" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479759 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpqkj\" (UniqueName: \"kubernetes.io/projected/84bb340d-f999-45fc-8e1c-d813e2ad4319-kube-api-access-qpqkj\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 
19:39:43.479825 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-config-data\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-logs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-scripts\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.480009 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.480021 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.480030 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpqkj\" (UniqueName: \"kubernetes.io/projected/84bb340d-f999-45fc-8e1c-d813e2ad4319-kube-api-access-qpqkj\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-config-data\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-logs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-scripts\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.582496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.586916 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-logs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.588543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.589429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.589542 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.591968 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-scripts\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.592869 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-config-data\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.598472 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.598521 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2019fecbddc337ddf53783637eb0008bc901e49a55294deb1e2d06fbb77c3ae3/globalmount\"" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.602563 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.606831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpqkj\" (UniqueName: \"kubernetes.io/projected/84bb340d-f999-45fc-8e1c-d813e2ad4319-kube-api-access-qpqkj\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.640081 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.675130 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.678896 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.680345 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.728288 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.752301 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800471 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800550 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800568 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-logs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8lg\" (UniqueName: \"kubernetes.io/projected/99490f57-22ed-4652-a112-bf45feb67aee-kube-api-access-cv8lg\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.901989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902080 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902130 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-logs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8lg\" (UniqueName: \"kubernetes.io/projected/99490f57-22ed-4652-a112-bf45feb67aee-kube-api-access-cv8lg\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.903550 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-logs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.903839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.907928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.912927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.913642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.926078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.927534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8lg\" (UniqueName: \"kubernetes.io/projected/99490f57-22ed-4652-a112-bf45feb67aee-kube-api-access-cv8lg\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.930676 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.930718 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b323df4ccd136fd865256cd83fe693e56c32fbc8a05d96b41caf6babb703da86/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.979245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.064037 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.072263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.286533 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636"}
Feb 19 19:39:44 crc kubenswrapper[4722]: W0219 19:39:44.674312 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84bb340d_f999_45fc_8e1c_d813e2ad4319.slice/crio-e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976 WatchSource:0}: Error finding container e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976: Status 404 returned error can't find the container with id e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.675440 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.846200 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:39:44 crc kubenswrapper[4722]: W0219 19:39:44.855223 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99490f57_22ed_4652_a112_bf45feb67aee.slice/crio-ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd WatchSource:0}: Error finding container ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd: Status 404 returned error can't find the container with id ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.085123 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e51a47-7d37-46de-96cc-609365fab496" path="/var/lib/kubelet/pods/58e51a47-7d37-46de-96cc-609365fab496/volumes"
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.086969 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" path="/var/lib/kubelet/pods/74c5d98f-45b4-4fd8-876b-3471da720a4b/volumes"
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.298751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84bb340d-f999-45fc-8e1c-d813e2ad4319","Type":"ContainerStarted","Data":"e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976"}
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.300022 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99490f57-22ed-4652-a112-bf45feb67aee","Type":"ContainerStarted","Data":"ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd"}
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.304489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56"}
Feb 19 19:39:46 crc kubenswrapper[4722]: I0219 19:39:46.324374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99490f57-22ed-4652-a112-bf45feb67aee","Type":"ContainerStarted","Data":"3a7b5fad0bd64aaf40effe613e01c3aabc29d539bcc949fb8f17a519a11b024b"}
Feb 19 19:39:46 crc kubenswrapper[4722]: I0219 19:39:46.327951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84bb340d-f999-45fc-8e1c-d813e2ad4319","Type":"ContainerStarted","Data":"44101d7618ca249c3c5c563fb8a0acf458f204e06915400900d870e20bf3d61c"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.338881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99490f57-22ed-4652-a112-bf45feb67aee","Type":"ContainerStarted","Data":"870ba3af4dad9e4cb08a056b1ebb31973272630f093731881261d0918622a9a2"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341646 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core" containerID="cri-o://a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341674 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341641 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent" containerID="cri-o://14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341702 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent" containerID="cri-o://2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341696 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd" containerID="cri-o://1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.345412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84bb340d-f999-45fc-8e1c-d813e2ad4319","Type":"ContainerStarted","Data":"d99be33c8ebb5d272bc11d22e7f63115904f5bfad3e3f567d9aa8b4615656edf"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.362264 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.362247634 podStartE2EDuration="4.362247634s" podCreationTimestamp="2026-02-19 19:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:47.361144959 +0000 UTC m=+1286.973495293" watchObservedRunningTime="2026-02-19 19:39:47.362247634 +0000 UTC m=+1286.974597958"
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.383995 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.3839812 podStartE2EDuration="4.3839812s" podCreationTimestamp="2026-02-19 19:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:47.383471814 +0000 UTC m=+1286.995822138" watchObservedRunningTime="2026-02-19 19:39:47.3839812 +0000 UTC m=+1286.996331524"
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.417618 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.881998181 podStartE2EDuration="12.417596996s" podCreationTimestamp="2026-02-19 19:39:35 +0000 UTC" firstStartedPulling="2026-02-19 19:39:41.229354073 +0000 UTC m=+1280.841704397" lastFinishedPulling="2026-02-19 19:39:46.764952868 +0000 UTC m=+1286.377303212" observedRunningTime="2026-02-19 19:39:47.410616538 +0000 UTC m=+1287.022966872" watchObservedRunningTime="2026-02-19 19:39:47.417596996 +0000 UTC m=+1287.029947330"
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.357807 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0" exitCode=0
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0"}
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56"}
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358026 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56" exitCode=2
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358094 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636" exitCode=0
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636"}
Feb 19 19:39:53 crc kubenswrapper[4722]: I0219 19:39:53.402285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Feb 19 19:39:53 crc kubenswrapper[4722]: I0219 19:39:53.979745 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:53 crc kubenswrapper[4722]: I0219 19:39:53.980076 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.025700 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.036355 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.073160 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.073211 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.132917 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.163126 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423860 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423906 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423918 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423928 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.655193 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.655765 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.861863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.861955 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.885920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:57 crc kubenswrapper[4722]: I0219 19:39:57.534071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491366 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8" exitCode=0
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8"}
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab"}
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491970 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab"
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.559759 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.668791 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.668857 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.668914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669041 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669166 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669239 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669306 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669902 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.670563 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.670763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.678884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5" (OuterVolumeSpecName: "kube-api-access-dcll5") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "kube-api-access-dcll5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.694382 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts" (OuterVolumeSpecName: "scripts") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.775359 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.775584 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.775648 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.815413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.850341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.868824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data" (OuterVolumeSpecName: "config-data") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.877359 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.877499 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.877571 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.509988 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.548370 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.560763 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.578811 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579326 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579352 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent" Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579363 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579371 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd" Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579408 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579417 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core" Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579428 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579434 4722 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579663 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579685 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579704 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579729 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.584368 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.590646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.590856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.613820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " 
pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.695128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.695347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.695498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.796992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797621 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797965 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.798012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.798026 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.802673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.802883 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.807957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.809816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.817753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") pod \"ceilometer-0\" (UID: 
\"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.907123 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:02 crc kubenswrapper[4722]: I0219 19:40:02.413033 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:02 crc kubenswrapper[4722]: W0219 19:40:02.415054 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e17a08d_48a9_43c6_acd3_5bcc13df91df.slice/crio-d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243 WatchSource:0}: Error finding container d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243: Status 404 returned error can't find the container with id d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243 Feb 19 19:40:02 crc kubenswrapper[4722]: I0219 19:40:02.527089 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243"} Feb 19 19:40:03 crc kubenswrapper[4722]: I0219 19:40:03.083825 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" path="/var/lib/kubelet/pods/d3534949-6af7-4bf0-ba36-ed96804ada1b/volumes" Feb 19 19:40:03 crc kubenswrapper[4722]: I0219 19:40:03.542542 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"} Feb 19 19:40:04 crc kubenswrapper[4722]: I0219 19:40:04.567615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"} Feb 19 19:40:04 crc kubenswrapper[4722]: I0219 19:40:04.567966 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"} Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.394720 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.396307 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.407033 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.495533 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.498255 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.508322 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.508381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.541140 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.615645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.627529 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.629144 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.631598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.640038 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.643246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.717213 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.718556 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.718920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.719006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.721752 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.721821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.722938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod 
\"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.734139 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.757639 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.820518 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.822183 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824760 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.826651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.845745 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.847045 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.848302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.850654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.873873 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.933727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.933801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.933949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.934044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.935926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.990470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.024401 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.035946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.036047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.036936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: 
\"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.057670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.059054 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.061968 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.067434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.070069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.091589 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.152617 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.241362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.241542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.343916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.344043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.344859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.383286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.395043 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.527300 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.675217 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"} Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.676781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.692744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nq58z" event={"ID":"823fc346-84d0-4920-bc42-ec213d0c6eef","Type":"ContainerStarted","Data":"9321baf0393d59bba73cb4fc60396e56bf1d2d8783d6fe2b8d651c63240d3d1c"} Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.699624 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:40:07 crc 
kubenswrapper[4722]: I0219 19:40:07.730672 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.814398166 podStartE2EDuration="6.730649033s" podCreationTimestamp="2026-02-19 19:40:01 +0000 UTC" firstStartedPulling="2026-02-19 19:40:02.417527811 +0000 UTC m=+1302.029878135" lastFinishedPulling="2026-02-19 19:40:06.333778678 +0000 UTC m=+1305.946129002" observedRunningTime="2026-02-19 19:40:07.713670354 +0000 UTC m=+1307.326020678" watchObservedRunningTime="2026-02-19 19:40:07.730649033 +0000 UTC m=+1307.342999357" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.978713 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.150873 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.161519 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:40:08 crc kubenswrapper[4722]: W0219 19:40:08.221814 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a72e03c_87f6_4d54_8ea1_f8abed33bd2c.slice/crio-7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75 WatchSource:0}: Error finding container 7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75: Status 404 returned error can't find the container with id 7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75 Feb 19 19:40:08 crc kubenswrapper[4722]: W0219 19:40:08.269437 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f262eb9_64a7_4b10_85f9_4bc43d512f60.slice/crio-3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d WatchSource:0}: Error finding 
container 3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d: Status 404 returned error can't find the container with id 3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.358990 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:40:08 crc kubenswrapper[4722]: W0219 19:40:08.367950 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b8ebb77_caea_46ca_8989_d2dd37bf2df5.slice/crio-d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255 WatchSource:0}: Error finding container d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255: Status 404 returned error can't find the container with id d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.708714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerStarted","Data":"bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.709115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerStarted","Data":"d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.711215 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerStarted","Data":"a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.711256 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerStarted","Data":"3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.715309 4722 generic.go:334] "Generic (PLEG): container finished" podID="84699ef3-8d21-4493-8875-81de167ee617" containerID="02fdf9891e0a4a5e6a9cd6279f1ac5170d3eaad2e2904682a600a6d410fb2a19" exitCode=0 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.715732 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4fzxz" event={"ID":"84699ef3-8d21-4493-8875-81de167ee617","Type":"ContainerDied","Data":"02fdf9891e0a4a5e6a9cd6279f1ac5170d3eaad2e2904682a600a6d410fb2a19"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.715991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4fzxz" event={"ID":"84699ef3-8d21-4493-8875-81de167ee617","Type":"ContainerStarted","Data":"6429e62e2e41653fc6362e37e869fbccbb59b4e67585320b1880dd9be47080f2"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.719602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" event={"ID":"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c","Type":"ContainerStarted","Data":"7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.726101 4722 generic.go:334] "Generic (PLEG): container finished" podID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerID="34ce6fe937d88e617e83f04f4163bf9713e6cac4114d5734077d30be33461dbc" exitCode=0 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.726428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nq58z" 
event={"ID":"823fc346-84d0-4920-bc42-ec213d0c6eef","Type":"ContainerDied","Data":"34ce6fe937d88e617e83f04f4163bf9713e6cac4114d5734077d30be33461dbc"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.730134 4722 generic.go:334] "Generic (PLEG): container finished" podID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerID="59b7ab3b9b5c89b55e17c8616e639ea24cc02e1ca89d3d887ff255092c310b2a" exitCode=0 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.730985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l92p9" event={"ID":"c6e27062-a94f-4d8d-8a07-b940d9aa572e","Type":"ContainerDied","Data":"59b7ab3b9b5c89b55e17c8616e639ea24cc02e1ca89d3d887ff255092c310b2a"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.731019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l92p9" event={"ID":"c6e27062-a94f-4d8d-8a07-b940d9aa572e","Type":"ContainerStarted","Data":"3111868eed1037ee898be61210727fc20dfdcb07c463168784d2422bc46d76bc"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.744606 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-cda6-account-create-update-45ddh" podStartSLOduration=2.744583622 podStartE2EDuration="2.744583622s" podCreationTimestamp="2026-02-19 19:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:08.72554052 +0000 UTC m=+1308.337890854" watchObservedRunningTime="2026-02-19 19:40:08.744583622 +0000 UTC m=+1308.356933946" Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.790890 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-bc61-account-create-update-km828" podStartSLOduration=2.790865372 podStartE2EDuration="2.790865372s" podCreationTimestamp="2026-02-19 19:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:08.775071401 +0000 UTC m=+1308.387421745" watchObservedRunningTime="2026-02-19 19:40:08.790865372 +0000 UTC m=+1308.403215706" Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.742815 4722 generic.go:334] "Generic (PLEG): container finished" podID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerID="a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3" exitCode=0 Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.742889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerDied","Data":"a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3"} Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.744858 4722 generic.go:334] "Generic (PLEG): container finished" podID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerID="0c662d869f0260b21b14e815b1c26ef3d995bd4318e89a8c8d85dd5703eaa89e" exitCode=0 Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.744928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" event={"ID":"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c","Type":"ContainerDied","Data":"0c662d869f0260b21b14e815b1c26ef3d995bd4318e89a8c8d85dd5703eaa89e"} Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.746560 4722 generic.go:334] "Generic (PLEG): container finished" podID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerID="bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783" exitCode=0 Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.746672 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerDied","Data":"bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783"} Feb 19 19:40:10 crc 
kubenswrapper[4722]: I0219 19:40:10.310551 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.427336 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"84699ef3-8d21-4493-8875-81de167ee617\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.427511 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"84699ef3-8d21-4493-8875-81de167ee617\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.427993 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84699ef3-8d21-4493-8875-81de167ee617" (UID: "84699ef3-8d21-4493-8875-81de167ee617"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.428238 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.435393 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l" (OuterVolumeSpecName: "kube-api-access-kqt6l") pod "84699ef3-8d21-4493-8875-81de167ee617" (UID: "84699ef3-8d21-4493-8875-81de167ee617"). 
InnerVolumeSpecName "kube-api-access-kqt6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.487513 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.491620 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529210 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod \"823fc346-84d0-4920-bc42-ec213d0c6eef\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529664 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"823fc346-84d0-4920-bc42-ec213d0c6eef\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "823fc346-84d0-4920-bc42-ec213d0c6eef" (UID: "823fc346-84d0-4920-bc42-ec213d0c6eef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.530200 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6e27062-a94f-4d8d-8a07-b940d9aa572e" (UID: "c6e27062-a94f-4d8d-8a07-b940d9aa572e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.530270 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.530285 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.533347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp" (OuterVolumeSpecName: "kube-api-access-dq5kp") pod "c6e27062-a94f-4d8d-8a07-b940d9aa572e" (UID: "c6e27062-a94f-4d8d-8a07-b940d9aa572e"). InnerVolumeSpecName "kube-api-access-dq5kp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.535872 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n" (OuterVolumeSpecName: "kube-api-access-j5t2n") pod "823fc346-84d0-4920-bc42-ec213d0c6eef" (UID: "823fc346-84d0-4920-bc42-ec213d0c6eef"). InnerVolumeSpecName "kube-api-access-j5t2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.632044 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.632095 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.632109 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.756460 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4fzxz" event={"ID":"84699ef3-8d21-4493-8875-81de167ee617","Type":"ContainerDied","Data":"6429e62e2e41653fc6362e37e869fbccbb59b4e67585320b1880dd9be47080f2"} Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.756497 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6429e62e2e41653fc6362e37e869fbccbb59b4e67585320b1880dd9be47080f2" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.756503 4722 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.757942 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nq58z" event={"ID":"823fc346-84d0-4920-bc42-ec213d0c6eef","Type":"ContainerDied","Data":"9321baf0393d59bba73cb4fc60396e56bf1d2d8783d6fe2b8d651c63240d3d1c"} Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.757978 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9321baf0393d59bba73cb4fc60396e56bf1d2d8783d6fe2b8d651c63240d3d1c" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.757950 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.759607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l92p9" event={"ID":"c6e27062-a94f-4d8d-8a07-b940d9aa572e","Type":"ContainerDied","Data":"3111868eed1037ee898be61210727fc20dfdcb07c463168784d2422bc46d76bc"} Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.759632 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3111868eed1037ee898be61210727fc20dfdcb07c463168784d2422bc46d76bc" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.759679 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.152850 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.244385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.244824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.245656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f262eb9-64a7-4b10-85f9-4bc43d512f60" (UID: "3f262eb9-64a7-4b10-85f9-4bc43d512f60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.270631 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf" (OuterVolumeSpecName: "kube-api-access-4zxdf") pod "3f262eb9-64a7-4b10-85f9-4bc43d512f60" (UID: "3f262eb9-64a7-4b10-85f9-4bc43d512f60"). InnerVolumeSpecName "kube-api-access-4zxdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.347606 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.347895 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.380049 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.386962 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448630 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448793 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448843 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.449768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" (UID: "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.449842 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b8ebb77-caea-46ca-8989-d2dd37bf2df5" (UID: "5b8ebb77-caea-46ca-8989-d2dd37bf2df5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.452551 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs" (OuterVolumeSpecName: "kube-api-access-4gvfs") pod "5b8ebb77-caea-46ca-8989-d2dd37bf2df5" (UID: "5b8ebb77-caea-46ca-8989-d2dd37bf2df5"). InnerVolumeSpecName "kube-api-access-4gvfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.453846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2" (OuterVolumeSpecName: "kube-api-access-7ptb2") pod "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" (UID: "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c"). InnerVolumeSpecName "kube-api-access-7ptb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551572 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551623 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551637 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551648 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.782704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerDied","Data":"d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255"} Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 
19:40:11.782745 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.782870 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.785625 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.785899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerDied","Data":"3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d"} Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.785940 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.788581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" event={"ID":"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c","Type":"ContainerDied","Data":"7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75"} Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.788614 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.788720 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.811449 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.811820 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": dial tcp 10.217.0.167:9292: i/o timeout" Feb 19 19:40:12 crc kubenswrapper[4722]: E0219 19:40:12.034240 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b8ebb77_caea_46ca_8989_d2dd37bf2df5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a72e03c_87f6_4d54_8ea1_f8abed33bd2c.slice\": RecentStats: unable to find data in memory cache]" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.196574 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"] Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200808 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84699ef3-8d21-4493-8875-81de167ee617" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200824 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="84699ef3-8d21-4493-8875-81de167ee617" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 
19:40:17.200834 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200840 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200854 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200860 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200871 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200876 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200888 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200895 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200905 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200910 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201097 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="84699ef3-8d21-4493-8875-81de167ee617" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201106 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201121 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201132 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201145 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201172 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201918 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.204927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6spgl" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.205297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.207647 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.217611 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"] Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229122 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " 
pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " 
pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.337001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.337838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.344963 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.350533 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.523635 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:18 crc kubenswrapper[4722]: I0219 19:40:18.184224 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"] Feb 19 19:40:18 crc kubenswrapper[4722]: W0219 19:40:18.199395 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2859f56_714b_43b5_bb67_6ee5493d4f11.slice/crio-6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb WatchSource:0}: Error finding container 6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb: Status 404 returned error can't find the container with id 6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb Feb 19 19:40:18 crc kubenswrapper[4722]: I0219 19:40:18.863742 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerStarted","Data":"6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb"} Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.016086 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.016981 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent" containerID="cri-o://9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84" gracePeriod=30 Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.017734 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent" containerID="cri-o://94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597" gracePeriod=30 Feb 19 19:40:24 crc 
kubenswrapper[4722]: I0219 19:40:24.017788 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd" containerID="cri-o://d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4" gracePeriod=30 Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.017886 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core" containerID="cri-o://974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0" gracePeriod=30 Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.026647 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.199:3000/\": EOF" Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.921795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"} Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.921814 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4" exitCode=0 Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922190 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0" exitCode=2 Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922204 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" 
containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84" exitCode=0 Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"} Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"} Feb 19 19:40:25 crc kubenswrapper[4722]: I0219 19:40:25.932139 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerStarted","Data":"0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e"} Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.554566 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.591017 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cddxh" podStartSLOduration=2.9299823849999997 podStartE2EDuration="9.590992917s" podCreationTimestamp="2026-02-19 19:40:17 +0000 UTC" firstStartedPulling="2026-02-19 19:40:18.202261006 +0000 UTC m=+1317.814611330" lastFinishedPulling="2026-02-19 19:40:24.863271538 +0000 UTC m=+1324.475621862" observedRunningTime="2026-02-19 19:40:25.956619409 +0000 UTC m=+1325.568969753" watchObservedRunningTime="2026-02-19 19:40:26.590992917 +0000 UTC m=+1326.203343241" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630850 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630978 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.631029 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.631128 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.632929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.633323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.646328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts" (OuterVolumeSpecName: "scripts") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.646495 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9" (OuterVolumeSpecName: "kube-api-access-w7dx9") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "kube-api-access-w7dx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.676721 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.718803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733395 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733586 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733676 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733738 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733803 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733860 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.802311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data" (OuterVolumeSpecName: "config-data") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.835754 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.942439 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597" exitCode=0 Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.942593 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"} Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.944084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243"} Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.944182 4722 scope.go:117] "RemoveContainer" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.942673 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.979704 4722 scope.go:117] "RemoveContainer" containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0" Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.989016 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.015256 4722 scope.go:117] "RemoveContainer" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.060989 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.061360 4722 scope.go:117] "RemoveContainer" containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066234 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066637 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066660 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd" Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066677 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066686 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent" Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066711 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent" Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066719 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066725 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066918 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066938 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066961 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066969 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.068803 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.070676 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.071985 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.085162 4722 scope.go:117] "RemoveContainer" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.085583 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" path="/var/lib/kubelet/pods/3e17a08d-48a9-43c6-acd3-5bcc13df91df/volumes" Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.086001 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4\": container with ID starting with d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4 not found: ID does not exist" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.086097 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"} err="failed to get container status \"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4\": rpc error: code = NotFound desc = could not find container \"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4\": container with ID starting with d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4 not found: ID does not exist" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.088514 4722 scope.go:117] "RemoveContainer" 
containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.086324 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.089537 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0\": container with ID starting with 974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0 not found: ID does not exist" containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.089649 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"} err="failed to get container status \"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0\": rpc error: code = NotFound desc = could not find container \"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0\": container with ID starting with 974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0 not found: ID does not exist" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.089747 4722 scope.go:117] "RemoveContainer" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597" Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.090434 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597\": container with ID starting with 94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597 not found: ID does not exist" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.090480 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"} err="failed to get container status \"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597\": rpc error: code = NotFound desc = could not find container \"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597\": container with ID starting with 94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597 not found: ID does not exist" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.090508 4722 scope.go:117] "RemoveContainer" containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84" Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.090772 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84\": container with ID starting with 9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84 not found: ID does not exist" containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.090853 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"} err="failed to get container status \"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84\": rpc error: code = NotFound desc = could not find container \"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84\": container with ID starting with 9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84 not found: ID does not exist" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168341 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hgsn\" (UniqueName: 
\"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168706 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.169114 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.169191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271206 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271350 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271513 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.272039 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.272081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.277172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 
19:40:27.277199 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.278541 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.279015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.292678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.385690 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:27 crc kubenswrapper[4722]: W0219 19:40:27.830002 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd4ffd8_1f63_4881_9774_9dda64b8ae5c.slice/crio-0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b WatchSource:0}: Error finding container 0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b: Status 404 returned error can't find the container with id 0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.833063 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.956085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b"} Feb 19 19:40:28 crc kubenswrapper[4722]: I0219 19:40:28.969005 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a"} Feb 19 19:40:30 crc kubenswrapper[4722]: I0219 19:40:30.996708 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193"} Feb 19 19:40:32 crc kubenswrapper[4722]: I0219 19:40:32.010032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1"} Feb 19 19:40:33 crc kubenswrapper[4722]: I0219 
19:40:33.026661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099"} Feb 19 19:40:33 crc kubenswrapper[4722]: I0219 19:40:33.028064 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:40:33 crc kubenswrapper[4722]: I0219 19:40:33.068685 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.15073089 podStartE2EDuration="7.068657055s" podCreationTimestamp="2026-02-19 19:40:26 +0000 UTC" firstStartedPulling="2026-02-19 19:40:27.834833591 +0000 UTC m=+1327.447183925" lastFinishedPulling="2026-02-19 19:40:32.752759766 +0000 UTC m=+1332.365110090" observedRunningTime="2026-02-19 19:40:33.046674611 +0000 UTC m=+1332.659024975" watchObservedRunningTime="2026-02-19 19:40:33.068657055 +0000 UTC m=+1332.681007419" Feb 19 19:40:36 crc kubenswrapper[4722]: I0219 19:40:36.061598 4722 generic.go:334] "Generic (PLEG): container finished" podID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerID="0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e" exitCode=0 Feb 19 19:40:36 crc kubenswrapper[4722]: I0219 19:40:36.062004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerDied","Data":"0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e"} Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.499745 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.609783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.610086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.610178 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.610221 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.615136 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts" (OuterVolumeSpecName: "scripts") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.615965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg" (OuterVolumeSpecName: "kube-api-access-jj9lg") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "kube-api-access-jj9lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.639720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.642781 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data" (OuterVolumeSpecName: "config-data") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712171 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712390 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712477 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712549 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.084000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerDied","Data":"6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb"} Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.084251 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.084138 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.196816 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 19:40:38 crc kubenswrapper[4722]: E0219 19:40:38.197228 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerName="nova-cell0-conductor-db-sync" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.197251 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerName="nova-cell0-conductor-db-sync" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.197488 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerName="nova-cell0-conductor-db-sync" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.198221 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.201520 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6spgl" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.202660 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.216895 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.323940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 
19:40:38.324045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.324078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chqr\" (UniqueName: \"kubernetes.io/projected/69f96c80-f951-453b-9880-ecd0591dc1bf-kube-api-access-2chqr\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.426497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.427562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.427999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2chqr\" (UniqueName: \"kubernetes.io/projected/69f96c80-f951-453b-9880-ecd0591dc1bf-kube-api-access-2chqr\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.433524 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.438662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.446953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chqr\" (UniqueName: \"kubernetes.io/projected/69f96c80-f951-453b-9880-ecd0591dc1bf-kube-api-access-2chqr\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.522852 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:39 crc kubenswrapper[4722]: I0219 19:40:39.092708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 19:40:39 crc kubenswrapper[4722]: W0219 19:40:39.094105 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f96c80_f951_453b_9880_ecd0591dc1bf.slice/crio-20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815 WatchSource:0}: Error finding container 20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815: Status 404 returned error can't find the container with id 20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815 Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.104613 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"69f96c80-f951-453b-9880-ecd0591dc1bf","Type":"ContainerStarted","Data":"538f2ff9d11fea42946598a8e0ff2a03260f9d88fac782cc737a7024d14f62df"} Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.104960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"69f96c80-f951-453b-9880-ecd0591dc1bf","Type":"ContainerStarted","Data":"20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815"} Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.104981 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.132087 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.132061939 podStartE2EDuration="2.132061939s" podCreationTimestamp="2026-02-19 19:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
19:40:40.120792088 +0000 UTC m=+1339.733142422" watchObservedRunningTime="2026-02-19 19:40:40.132061939 +0000 UTC m=+1339.744412293" Feb 19 19:40:48 crc kubenswrapper[4722]: I0219 19:40:48.563456 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.091566 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.093144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.096135 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.096714 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.110263 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.253034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.253187 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 
crc kubenswrapper[4722]: I0219 19:40:49.253221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.253257 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.262701 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.264195 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.266092 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.278313 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.354922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.355044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.355080 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.355120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: 
I0219 19:40:49.366342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.367046 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.368884 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.373888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.375364 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.383793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.385652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.387677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod 
\"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.451351 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.471005 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.471050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.471106 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.485216 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.486909 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.491988 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.509340 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.532283 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.533667 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.546780 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.597835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.597927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598207 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " 
pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.602961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.603437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.627443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.632862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc 
kubenswrapper[4722]: I0219 19:40:49.635040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.641343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.651482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.655236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.686377 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.688346 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.705022 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707690 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707986 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfmc\" (UniqueName: 
\"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.708221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.708263 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.773669 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810672 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc 
kubenswrapper[4722]: I0219 19:40:49.810758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810823 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811082 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811139 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.812832 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.818598 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.821651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.827108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.831574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.836277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.844538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.883834 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923269 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " 
pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.924509 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.924813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.928772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.933717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: 
I0219 19:40:49.934304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.967119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.985445 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.005613 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.025677 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.153767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"] Feb 19 19:40:50 crc kubenswrapper[4722]: W0219 19:40:50.259254 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a230c6_6844_4483_a8b4_0ae8073dff8d.slice/crio-bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05 WatchSource:0}: Error finding container bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05: Status 404 returned error can't find the container with id bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05 Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.498612 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.720514 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.836266 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:40:50 crc kubenswrapper[4722]: W0219 19:40:50.836759 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb663fb91_fb60_451c_a9c9_7278dbd1c9ac.slice/crio-f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29 WatchSource:0}: Error finding container f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29: Status 404 returned error can't find the container with id f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29 Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.851811 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:50 crc kubenswrapper[4722]: W0219 19:40:50.855076 4722 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14e3a56_49b6_4bc5_81f3_2e8b1da839b8.slice/crio-e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9 WatchSource:0}: Error finding container e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9: Status 404 returned error can't find the container with id e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9 Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.007140 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"] Feb 19 19:40:51 crc kubenswrapper[4722]: W0219 19:40:51.007589 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e629ce1_0108_4450_bb62_44ca1d2993b6.slice/crio-665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06 WatchSource:0}: Error finding container 665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06: Status 404 returned error can't find the container with id 665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06 Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.008654 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.010748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.010996 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.023595 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"] Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.040622 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"] Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.057002 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.057069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.057132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " 
pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.058330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.159985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.160076 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.160178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.160224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " 
pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.164288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.164288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.170666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.194938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.252637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerStarted","Data":"f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29"} Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.263683 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerStarted","Data":"3e855c266781287503b5b752c8ff71d55312447fd33f6883decf51bbec4b4e45"} Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.278335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerStarted","Data":"665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06"} Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.283678 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerStarted","Data":"e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9"} Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.286008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerStarted","Data":"e752336bd0fae94c244c48636db8246622b4108d65cd7eef8e5be29938427e5c"} Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.287843 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerStarted","Data":"60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1"} Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.287890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerStarted","Data":"bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05"} Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.317858 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dzq9w" podStartSLOduration=2.317837374 
podStartE2EDuration="2.317837374s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:51.305621103 +0000 UTC m=+1350.917971437" watchObservedRunningTime="2026-02-19 19:40:51.317837374 +0000 UTC m=+1350.930187698" Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.328015 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.089904 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"] Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.337448 4722 generic.go:334] "Generic (PLEG): container finished" podID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b" exitCode=0 Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.337779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerDied","Data":"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b"} Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.352630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerStarted","Data":"facd2fa08899d3f5f781d5d5581a5b5e4e0814ddf689e925edac6d9674d3eba2"} Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.986576 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.003242 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.365485 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerStarted","Data":"b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340"} Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.368437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerStarted","Data":"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"} Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.369412 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.392377 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" podStartSLOduration=3.392362034 podStartE2EDuration="3.392362034s" podCreationTimestamp="2026-02-19 19:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:53.381105594 +0000 UTC m=+1352.993455928" watchObservedRunningTime="2026-02-19 19:40:53.392362034 +0000 UTC m=+1353.004712348" Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.418536 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" podStartSLOduration=4.418516117 podStartE2EDuration="4.418516117s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:53.410349593 +0000 UTC m=+1353.022699907" watchObservedRunningTime="2026-02-19 19:40:53.418516117 +0000 UTC m=+1353.030866441" Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.405298 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerStarted","Data":"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4"} Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerStarted","Data":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"} Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerStarted","Data":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"} Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407483 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-log" containerID="cri-o://b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" gracePeriod=30 Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407510 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" containerID="cri-o://9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" gracePeriod=30 Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.409791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerStarted","Data":"e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec"} Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.409833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerStarted","Data":"33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3"} Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.414845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerStarted","Data":"346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0"} Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.414958 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0" gracePeriod=30 Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.433326 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.92319745 podStartE2EDuration="7.433306926s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.717273626 +0000 UTC m=+1350.329623950" lastFinishedPulling="2026-02-19 19:40:55.227383102 +0000 UTC m=+1354.839733426" observedRunningTime="2026-02-19 19:40:56.432660256 +0000 UTC m=+1356.045010600" watchObservedRunningTime="2026-02-19 19:40:56.433306926 +0000 UTC m=+1356.045657260" Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.453847 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.064392233 podStartE2EDuration="7.453826244s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.838621352 +0000 UTC m=+1350.450971676" lastFinishedPulling="2026-02-19 19:40:55.228055323 +0000 UTC m=+1354.840405687" observedRunningTime="2026-02-19 19:40:56.45303178 +0000 UTC m=+1356.065382104" 
watchObservedRunningTime="2026-02-19 19:40:56.453826244 +0000 UTC m=+1356.066176568" Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.486235 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.770788789 podStartE2EDuration="7.486217362s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.511941558 +0000 UTC m=+1350.124291892" lastFinishedPulling="2026-02-19 19:40:55.227370151 +0000 UTC m=+1354.839720465" observedRunningTime="2026-02-19 19:40:56.476372705 +0000 UTC m=+1356.088723039" watchObservedRunningTime="2026-02-19 19:40:56.486217362 +0000 UTC m=+1356.098567686" Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.511656 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.1377026040000002 podStartE2EDuration="7.511602612s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.857274472 +0000 UTC m=+1350.469624796" lastFinishedPulling="2026-02-19 19:40:55.23117448 +0000 UTC m=+1354.843524804" observedRunningTime="2026-02-19 19:40:56.505025717 +0000 UTC m=+1356.117376031" watchObservedRunningTime="2026-02-19 19:40:56.511602612 +0000 UTC m=+1356.123952936" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.237780 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.394979 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422019 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422551 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.426417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs" (OuterVolumeSpecName: "logs") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.471759 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn" (OuterVolumeSpecName: "kube-api-access-m7fwn") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "kube-api-access-m7fwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.473390 4722 generic.go:334] "Generic (PLEG): container finished" podID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" exitCode=0 Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.473422 4722 generic.go:334] "Generic (PLEG): container finished" podID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" exitCode=143 Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474770 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474843 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerDied","Data":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"} Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerDied","Data":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"} Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerDied","Data":"e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9"} Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474899 4722 scope.go:117] "RemoveContainer" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.477308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.488323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data" (OuterVolumeSpecName: "config-data") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526546 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526584 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526598 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526610 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.659016 4722 scope.go:117] "RemoveContainer" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.703384 4722 scope.go:117] "RemoveContainer" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.704106 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": container with ID starting with 9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283 not found: ID does not exist" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 
19:40:57.704143 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"} err="failed to get container status \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": rpc error: code = NotFound desc = could not find container \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": container with ID starting with 9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704183 4722 scope.go:117] "RemoveContainer" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.704539 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": container with ID starting with b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655 not found: ID does not exist" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704558 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"} err="failed to get container status \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": rpc error: code = NotFound desc = could not find container \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": container with ID starting with b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704571 4722 scope.go:117] "RemoveContainer" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc 
kubenswrapper[4722]: I0219 19:40:57.704758 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"} err="failed to get container status \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": rpc error: code = NotFound desc = could not find container \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": container with ID starting with 9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704776 4722 scope.go:117] "RemoveContainer" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704926 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"} err="failed to get container status \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": rpc error: code = NotFound desc = could not find container \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": container with ID starting with b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.817207 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.835723 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852290 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.852703 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" 
containerName="nova-metadata-log" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852719 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-log" Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.852751 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852758 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852948 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852959 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-log" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.856657 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.860023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.860258 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.873050 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039322 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039763 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"nova-metadata-0\" 
(UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142483 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.145901 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.146197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.148346 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.161269 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"nova-metadata-0\" 
(UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.188731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: W0219 19:40:58.674977 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3ea5e74_4865_4550_bf03_3214021a9cda.slice/crio-2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d WatchSource:0}: Error finding container 2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d: Status 404 returned error can't find the container with id 2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.680183 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.081996 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" path="/var/lib/kubelet/pods/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8/volumes" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.502120 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerStarted","Data":"808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.502193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerStarted","Data":"24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.502207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerStarted","Data":"2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.504536 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerID="60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1" exitCode=0 Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.504630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerDied","Data":"60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.523777 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.523760508 podStartE2EDuration="2.523760508s" podCreationTimestamp="2026-02-19 19:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:59.516884063 +0000 UTC m=+1359.129234387" watchObservedRunningTime="2026-02-19 19:40:59.523760508 +0000 UTC m=+1359.136110832" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.774665 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.775368 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.884648 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.006820 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: 
I0219 19:41:00.006893 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.028384 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.058899 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.103126 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.103649 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns" containerID="cri-o://b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac" gracePeriod=10 Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.516581 4722 generic.go:334] "Generic (PLEG): container finished" podID="8f530e65-8397-49d6-929a-201bb5dfe585" containerID="b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac" exitCode=0 Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.516783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerDied","Data":"b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac"} Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.580495 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.858583 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.859096 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.108296 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.205777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.205834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.205867 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.206022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod 
\"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.213460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2" (OuterVolumeSpecName: "kube-api-access-5mfl2") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "kube-api-access-5mfl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.240676 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts" (OuterVolumeSpecName: "scripts") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.246293 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.284429 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data" (OuterVolumeSpecName: "config-data") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310195 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310229 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310268 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310278 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.420508 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514096 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514175 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514317 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kgkx\" 
(UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.520509 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx" (OuterVolumeSpecName: "kube-api-access-5kgkx") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "kube-api-access-5kgkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.545711 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.546743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerDied","Data":"bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05"} Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.546811 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.550090 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.550702 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerDied","Data":"6a529cc3a96af23463f3dfa462bf02cb46f29fb8e36534fccb322ef7ab7a6728"} Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.550747 4722 scope.go:117] "RemoveContainer" containerID="b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.591346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config" (OuterVolumeSpecName: "config") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.608756 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.617933 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.617961 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.617971 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.632658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.655730 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.668730 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.703454 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.719823 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.719856 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.719869 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.733102 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.733365 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log" containerID="cri-o://24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813" gracePeriod=30 Feb 19 19:41:01 crc 
kubenswrapper[4722]: I0219 19:41:01.733799 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata" containerID="cri-o://808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f" gracePeriod=30 Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.741317 4722 scope.go:117] "RemoveContainer" containerID="380c536ebfd3cf4e5ded9eb26bb64cd838a985f8d5ba0c199a97d05a07b511f3" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.754107 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.754338 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log" containerID="cri-o://33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3" gracePeriod=30 Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.754464 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api" containerID="cri-o://e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec" gracePeriod=30 Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.885621 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.895400 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.349932 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.350137 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/kube-state-metrics-0" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics" containerID="cri-o://3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2" gracePeriod=30 Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.559485 4722 generic.go:334] "Generic (PLEG): container finished" podID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerID="33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3" exitCode=143 Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.559574 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerDied","Data":"33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3"} Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.561724 4722 generic.go:334] "Generic (PLEG): container finished" podID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerID="3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2" exitCode=2 Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.561799 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerDied","Data":"3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2"} Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565579 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerID="808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f" exitCode=0 Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565607 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerID="24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813" exitCode=143 Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerDied","Data":"808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f"} Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565698 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerDied","Data":"24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813"} Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565751 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" containerID="cri-o://87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" gracePeriod=30 Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.967047 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043297 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod 
\"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043765 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.044523 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs" (OuterVolumeSpecName: "logs") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.051344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t" (OuterVolumeSpecName: "kube-api-access-l498t") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "kube-api-access-l498t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.074793 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.089929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data" (OuterVolumeSpecName: "config-data") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.104043 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" path="/var/lib/kubelet/pods/8f530e65-8397-49d6-929a-201bb5dfe585/volumes" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.104200 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.147964 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148003 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148015 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148025 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148036 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.585784 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerDied","Data":"2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d"} Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.585844 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.586097 4722 scope.go:117] "RemoveContainer" containerID="808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.755451 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.770099 4722 scope.go:117] "RemoveContainer" containerID="24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.772254 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.783569 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794296 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794707 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerName="nova-manage" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794722 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerName="nova-manage" Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794737 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics" Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794766 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794772 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata" Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794785 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794791 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log" Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794806 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="init" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="init" Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794827 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794833 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795011 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795018 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795032 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795045 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerName="nova-manage" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.796069 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.804187 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.804318 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.830064 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865067 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") " Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865473 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865620 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.890732 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8" (OuterVolumeSpecName: "kube-api-access-8c6z8") pod "14a7aae0-6a51-49ed-b4dd-9b274885d1da" (UID: "14a7aae0-6a51-49ed-b4dd-9b274885d1da"). InnerVolumeSpecName "kube-api-access-8c6z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.967932 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968040 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968104 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968223 4722 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.973611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.979466 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.983719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.986829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 
19:41:04.131406 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.146292 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.172404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"0b7f5812-df88-4652-85af-75b6b7f994ee\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.172607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"0b7f5812-df88-4652-85af-75b6b7f994ee\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.172633 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"0b7f5812-df88-4652-85af-75b6b7f994ee\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.212706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc" (OuterVolumeSpecName: "kube-api-access-9lfmc") pod "0b7f5812-df88-4652-85af-75b6b7f994ee" (UID: "0b7f5812-df88-4652-85af-75b6b7f994ee"). InnerVolumeSpecName "kube-api-access-9lfmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.224691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b7f5812-df88-4652-85af-75b6b7f994ee" (UID: "0b7f5812-df88-4652-85af-75b6b7f994ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.238596 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data" (OuterVolumeSpecName: "config-data") pod "0b7f5812-df88-4652-85af-75b6b7f994ee" (UID: "0b7f5812-df88-4652-85af-75b6b7f994ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.275841 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.276186 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.276198 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.596952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerDied","Data":"5545dc8f3e2de249c7840626da07d4ee4ba5dd553856353617c9f89c2873d54d"} Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.596979 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.597015 4722 scope.go:117] "RemoveContainer" containerID="3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599296 4722 generic.go:334] "Generic (PLEG): container finished" podID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" exitCode=0 Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerDied","Data":"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4"} Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599381 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerDied","Data":"3e855c266781287503b5b752c8ff71d55312447fd33f6883decf51bbec4b4e45"} Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.630735 4722 scope.go:117] "RemoveContainer" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.635515 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.649124 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.661015 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.682728 4722 scope.go:117] "RemoveContainer" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" Feb 19 19:41:04 crc kubenswrapper[4722]: E0219 19:41:04.683364 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4\": container with ID starting with 87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4 not found: ID does not exist" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.683405 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4"} err="failed to get container status \"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4\": rpc error: code = 
NotFound desc = could not find container \"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4\": container with ID starting with 87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4 not found: ID does not exist" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.692213 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.702644 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: E0219 19:41:04.703119 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.703131 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.703341 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.704066 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.706029 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.713554 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.723256 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.724642 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.732229 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.732600 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.732872 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.742681 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77ghz\" (UniqueName: \"kubernetes.io/projected/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-api-access-77ghz\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807365 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807480 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807502 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909587 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77ghz\" (UniqueName: \"kubernetes.io/projected/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-api-access-77ghz\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909653 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.911622 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.911655 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.919285 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.920875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.923818 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.927637 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.939849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77ghz\" (UniqueName: \"kubernetes.io/projected/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-api-access-77ghz\") pod \"kube-state-metrics-0\" (UID: 
\"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.948206 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.971956 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.079886 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.093649 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" path="/var/lib/kubelet/pods/0b7f5812-df88-4652-85af-75b6b7f994ee/volumes" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.097295 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" path="/var/lib/kubelet/pods/14a7aae0-6a51-49ed-b4dd-9b274885d1da/volumes" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.097506 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.097936 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" path="/var/lib/kubelet/pods/e3ea5e74-4865-4550-bf03-3214021a9cda/volumes" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.611145 4722 generic.go:334] "Generic (PLEG): container finished" podID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerID="b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340" exitCode=0 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.611322 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerDied","Data":"b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.624656 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerStarted","Data":"a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.624705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerStarted","Data":"9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.624718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerStarted","Data":"1d5b78abbb5e2a59e1b1457349c70f190753858ae936d0ab19bb55cc724af44f"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.625568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:05 crc kubenswrapper[4722]: W0219 
19:41:05.644842 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8493c9f_328a_446d_8110_5879a7aedd2b.slice/crio-927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e WatchSource:0}: Error finding container 927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e: Status 404 returned error can't find the container with id 927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651282 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651661 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent" containerID="cri-o://89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651713 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core" containerID="cri-o://41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651760 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd" containerID="cri-o://c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651731 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent" 
containerID="cri-o://88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.694689 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.710798 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.710781115 podStartE2EDuration="2.710781115s" podCreationTimestamp="2026-02-19 19:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:05.66743453 +0000 UTC m=+1365.279784854" watchObservedRunningTime="2026-02-19 19:41:05.710781115 +0000 UTC m=+1365.323131429" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.648377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerStarted","Data":"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.648969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerStarted","Data":"b861b10dd2203a847e9aec1463641c9c77079ce2640617c70a1b02fbcb8c691f"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657782 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099" exitCode=0 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657812 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1" exitCode=2 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657822 4722 
generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193" exitCode=0 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657829 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a" exitCode=0 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.660368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8493c9f-328a-446d-8110-5879a7aedd2b","Type":"ContainerStarted","Data":"5fb1969d69a84b92a4b54b843e64e8275f19bbb8264232f994396b4209fc340d"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.660425 
4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.660436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8493c9f-328a-446d-8110-5879a7aedd2b","Type":"ContainerStarted","Data":"927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.674988 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.674970107 podStartE2EDuration="2.674970107s" podCreationTimestamp="2026-02-19 19:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:06.668672141 +0000 UTC m=+1366.281022465" watchObservedRunningTime="2026-02-19 19:41:06.674970107 +0000 UTC m=+1366.287320431" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.696512 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.212092656 podStartE2EDuration="2.69649406s" podCreationTimestamp="2026-02-19 19:41:04 +0000 UTC" firstStartedPulling="2026-02-19 19:41:05.648449867 +0000 UTC m=+1365.260800191" lastFinishedPulling="2026-02-19 19:41:06.132851251 +0000 UTC m=+1365.745201595" observedRunningTime="2026-02-19 19:41:06.684529916 +0000 UTC m=+1366.296880240" watchObservedRunningTime="2026-02-19 19:41:06.69649406 +0000 UTC m=+1366.308844384" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.907952 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958000 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958315 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958338 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.960261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.961765 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.965335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts" (OuterVolumeSpecName: "scripts") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.968627 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn" (OuterVolumeSpecName: "kube-api-access-5hgsn") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "kube-api-access-5hgsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.032397 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.054661 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064040 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064066 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064077 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064086 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064096 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.075767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.128058 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data" (OuterVolumeSpecName: "config-data") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.165786 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.166411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.166630 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.166782 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.167734 4722 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.167911 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.170853 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b" (OuterVolumeSpecName: "kube-api-access-q6p5b") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "kube-api-access-q6p5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.173270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts" (OuterVolumeSpecName: "scripts") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.195356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.223963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data" (OuterVolumeSpecName: "config-data") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276192 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276224 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276257 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276267 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.673303 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.674380 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerDied","Data":"facd2fa08899d3f5f781d5d5581a5b5e4e0814ddf689e925edac6d9674d3eba2"} Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.674426 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="facd2fa08899d3f5f781d5d5581a5b5e4e0814ddf689e925edac6d9674d3eba2" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679103 4722 generic.go:334] "Generic (PLEG): container finished" podID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerID="e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec" exitCode=0 Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679178 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerDied","Data":"e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec"} Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerDied","Data":"e752336bd0fae94c244c48636db8246622b4108d65cd7eef8e5be29938427e5c"} Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679211 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e752336bd0fae94c244c48636db8246622b4108d65cd7eef8e5be29938427e5c" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.686323 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.686133 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b"} Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.693018 4722 scope.go:117] "RemoveContainer" containerID="c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.743845 4722 scope.go:117] "RemoveContainer" containerID="41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745300 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745780 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745796 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd" Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745805 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent" Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745835 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerName="nova-cell1-conductor-db-sync" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745841 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerName="nova-cell1-conductor-db-sync" Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745854 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745859 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent" Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745868 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745874 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746039 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746051 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746064 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerName="nova-cell1-conductor-db-sync" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746081 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746095 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 
19:41:07.746900 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.748919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.750542 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.773348 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.783449 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788061 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788212 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788328 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc6h5\" 
(UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788803 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmplr\" (UniqueName: \"kubernetes.io/projected/f7880856-0db7-4bbf-9202-04f90868fc1d-kube-api-access-hmplr\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788966 4722 scope.go:117] "RemoveContainer" containerID="88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193" Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.795425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs" (OuterVolumeSpecName: "logs") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.820803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5" (OuterVolumeSpecName: "kube-api-access-qc6h5") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "kube-api-access-qc6h5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.831975 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.837459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmplr\" (UniqueName: \"kubernetes.io/projected/f7880856-0db7-4bbf-9202-04f90868fc1d-kube-api-access-hmplr\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901721 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901731 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901741 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.905598 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.906364 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906383 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log"
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.906427 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906434 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906699 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906733 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.910366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.914511 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.914740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.914948 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.917343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.924799 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.926203 4722 scope.go:117] "RemoveContainer" containerID="89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.938735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmplr\" (UniqueName: \"kubernetes.io/projected/f7880856-0db7-4bbf-9202-04f90868fc1d-kube-api-access-hmplr\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.942370 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data" (OuterVolumeSpecName: "config-data") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.987386 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003596 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003880 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.004017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.004037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.004095 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.091631 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106436 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.111268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.111586 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.111808 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.113203 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.115488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.119425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.124215 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.132733 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.254698 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.641353 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.704683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7880856-0db7-4bbf-9202-04f90868fc1d","Type":"ContainerStarted","Data":"bed3c002650ca5de4b056281524f0247c520fc23a59bc8cb6c23476a7afb5560"}
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.704757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.790814 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: W0219 19:41:08.791583 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35309d31_c095_492f_8645_f99a629dafd5.slice/crio-c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121 WatchSource:0}: Error finding container c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121: Status 404 returned error can't find the container with id c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.829678 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.845215 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.859014 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.862733 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.866013 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.882982 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926465 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926626 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926682 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.033085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.033225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.044815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.082513 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" path="/var/lib/kubelet/pods/a47ca62a-2546-47ec-80f7-1aa7e739e43e/volumes"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.083182 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" path="/var/lib/kubelet/pods/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c/volumes"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.132172 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.132285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.201719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: W0219 19:41:09.701867 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96772512_a8ae_42f5_b8ce_748d1115c4ef.slice/crio-0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7 WatchSource:0}: Error finding container 0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7: Status 404 returned error can't find the container with id 0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.702471 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.730025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7880856-0db7-4bbf-9202-04f90868fc1d","Type":"ContainerStarted","Data":"b783b9cc92c606a5cb1cba333ad7c569842c37085e96bb067ddeb43d5dffefa1"}
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.739169 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.752050 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346"}
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.752108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121"}
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.757140 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.757118988 podStartE2EDuration="2.757118988s" podCreationTimestamp="2026-02-19 19:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:09.750024667 +0000 UTC m=+1369.362375011" watchObservedRunningTime="2026-02-19 19:41:09.757118988 +0000 UTC m=+1369.369469312"
Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.080175 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.762908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331"}
Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.763300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443"}
Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.765943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerStarted","Data":"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"}
Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.765994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerStarted","Data":"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"}
Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.766010 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerStarted","Data":"0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7"}
Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.781770 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.781755169 podStartE2EDuration="2.781755169s" podCreationTimestamp="2026-02-19 19:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:10.780403228 +0000 UTC m=+1370.392753552" watchObservedRunningTime="2026-02-19 19:41:10.781755169 +0000 UTC m=+1370.394105493"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.004018 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"]
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.006855 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.029352 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"]
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.124670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.124782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.124839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.226905 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.247309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.341921 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:12 crc kubenswrapper[4722]: W0219 19:41:12.922372 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7bcc56_4611_489d_8f1b_2105503393de.slice/crio-9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338 WatchSource:0}: Error finding container 9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338: Status 404 returned error can't find the container with id 9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338
Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.923664 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"]
Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.128844 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.799652 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823"}
Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.800027 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.802861 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a7bcc56-4611-489d-8f1b-2105503393de" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772" exitCode=0
Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.802924 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772"}
Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.802967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerStarted","Data":"9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338"}
Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.824103 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.071988225 podStartE2EDuration="6.824085277s" podCreationTimestamp="2026-02-19 19:41:07 +0000 UTC" firstStartedPulling="2026-02-19 19:41:08.794082341 +0000 UTC m=+1368.406432655" lastFinishedPulling="2026-02-19 19:41:12.546179373 +0000 UTC m=+1372.158529707" observedRunningTime="2026-02-19 19:41:13.823660374 +0000 UTC m=+1373.436010698" watchObservedRunningTime="2026-02-19 19:41:13.824085277 +0000 UTC m=+1373.436435601"
Feb 19 19:41:14 crc kubenswrapper[4722]: I0219 19:41:14.131634 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 19:41:14 crc kubenswrapper[4722]: I0219 19:41:14.131698 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 19:41:14 crc kubenswrapper[4722]: I0219 19:41:14.846581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerStarted","Data":"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"}
Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.082892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.110280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.132830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.140322 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.140404 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.884659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 19:41:19 crc 
kubenswrapper[4722]: I0219 19:41:19.201885 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:41:19 crc kubenswrapper[4722]: I0219 19:41:19.202499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:41:19 crc kubenswrapper[4722]: I0219 19:41:19.894074 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a7bcc56-4611-489d-8f1b-2105503393de" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6" exitCode=0 Feb 19 19:41:19 crc kubenswrapper[4722]: I0219 19:41:19.894207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"} Feb 19 19:41:20 crc kubenswrapper[4722]: I0219 19:41:20.285519 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:20 crc kubenswrapper[4722]: I0219 19:41:20.285545 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:21 crc kubenswrapper[4722]: I0219 19:41:21.916074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerStarted","Data":"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"} Feb 19 19:41:21 crc kubenswrapper[4722]: I0219 19:41:21.944067 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2vsgx" podStartSLOduration=4.403396707 podStartE2EDuration="10.944048657s" podCreationTimestamp="2026-02-19 19:41:11 +0000 UTC" firstStartedPulling="2026-02-19 19:41:13.804969909 +0000 UTC m=+1373.417320233" lastFinishedPulling="2026-02-19 19:41:20.345621819 +0000 UTC m=+1379.957972183" observedRunningTime="2026-02-19 19:41:21.937256135 +0000 UTC m=+1381.549606469" watchObservedRunningTime="2026-02-19 19:41:21.944048657 +0000 UTC m=+1381.556398981" Feb 19 19:41:22 crc kubenswrapper[4722]: I0219 19:41:22.342841 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:22 crc kubenswrapper[4722]: I0219 19:41:22.342892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:23 crc kubenswrapper[4722]: I0219 19:41:23.393856 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2vsgx" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server" probeResult="failure" output=< Feb 19 19:41:23 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:41:23 crc kubenswrapper[4722]: > Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.137266 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.140890 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.143743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.958242 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.977771 4722 generic.go:334] "Generic (PLEG): container finished" podID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerID="346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0" exitCode=137 Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.979875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerDied","Data":"346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0"} Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.979938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerDied","Data":"f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29"} Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.979953 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.047702 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.145650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.145731 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.146536 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.153689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98" (OuterVolumeSpecName: "kube-api-access-42f98") pod "b663fb91-fb60-451c-a9c9-7278dbd1c9ac" (UID: "b663fb91-fb60-451c-a9c9-7278dbd1c9ac"). InnerVolumeSpecName "kube-api-access-42f98". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.180327 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data" (OuterVolumeSpecName: "config-data") pod "b663fb91-fb60-451c-a9c9-7278dbd1c9ac" (UID: "b663fb91-fb60-451c-a9c9-7278dbd1c9ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.191253 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b663fb91-fb60-451c-a9c9-7278dbd1c9ac" (UID: "b663fb91-fb60-451c-a9c9-7278dbd1c9ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.248681 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.248712 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.248722 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.988526 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.024133 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.036324 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.049262 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: E0219 19:41:28.049946 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.049972 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.050235 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.051119 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.055994 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.056272 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.059823 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.079530 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.169281 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vln\" (UniqueName: \"kubernetes.io/projected/168eaa46-c907-452a-8537-3cea6b524360-kube-api-access-h2vln\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.169884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.169939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 
crc kubenswrapper[4722]: I0219 19:41:28.170173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.170202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 
19:41:28.272504 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vln\" (UniqueName: \"kubernetes.io/projected/168eaa46-c907-452a-8537-3cea6b524360-kube-api-access-h2vln\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.277674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.277690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.283669 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.284591 4722 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.293227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vln\" (UniqueName: \"kubernetes.io/projected/168eaa46-c907-452a-8537-3cea6b524360-kube-api-access-h2vln\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.380144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.870009 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: W0219 19:41:28.891277 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod168eaa46_c907_452a_8537_3cea6b524360.slice/crio-87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f WatchSource:0}: Error finding container 87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f: Status 404 returned error can't find the container with id 87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.003776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"168eaa46-c907-452a-8537-3cea6b524360","Type":"ContainerStarted","Data":"87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f"} Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.091102 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" 
path="/var/lib/kubelet/pods/b663fb91-fb60-451c-a9c9-7278dbd1c9ac/volumes" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.206809 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.207580 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.207708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.213906 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.016098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"168eaa46-c907-452a-8537-3cea6b524360","Type":"ContainerStarted","Data":"9340639e2e9665570ab0da301dc3a40575632326238c2a65a98c80cedd64dc02"} Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.017459 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.022315 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.063588 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.063565383 podStartE2EDuration="2.063565383s" podCreationTimestamp="2026-02-19 19:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:30.036540199 +0000 UTC m=+1389.648890523" watchObservedRunningTime="2026-02-19 19:41:30.063565383 +0000 UTC m=+1389.675915727" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.271780 4722 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.279196 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.290070 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.321617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.321842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: 
\"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322784 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.427910 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.427974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" 
(UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428143 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428184 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429528 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 
19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.454478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.607779 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8"
Feb 19 19:41:31 crc kubenswrapper[4722]: I0219 19:41:31.152226 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"]
Feb 19 19:41:31 crc kubenswrapper[4722]: W0219 19:41:31.166318 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfcca6fc_5afb_464c_9852_3532ba5878a3.slice/crio-a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4 WatchSource:0}: Error finding container a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4: Status 404 returned error can't find the container with id a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.049263 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerID="2645caf8bc3502647b4c5a4dc4d97510df5ceb77697881dbc41661d5cae80579" exitCode=0
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.049304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerDied","Data":"2645caf8bc3502647b4c5a4dc4d97510df5ceb77697881dbc41661d5cae80579"}
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.049715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerStarted","Data":"a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4"}
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.395101 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.463953 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.644531 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"]
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.699858 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700251 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent" containerID="cri-o://f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346" gracePeriod=30
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700478 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent" containerID="cri-o://797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443" gracePeriod=30
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700671 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" containerID="cri-o://9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823" gracePeriod=30
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700830 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core" containerID="cri-o://aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331" gracePeriod=30
Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.725192 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.221:3000/\": EOF"
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.050196 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062188 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" containerID="9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823" exitCode=0
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062223 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" containerID="aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331" exitCode=2
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062233 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" containerID="f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346" exitCode=0
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823"}
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331"}
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346"}
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.064538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerStarted","Data":"07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb"}
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.064734 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log" containerID="cri-o://3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68" gracePeriod=30
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.064770 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api" containerID="cri-o://8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea" gracePeriod=30
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.103118 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" podStartSLOduration=3.103101773 podStartE2EDuration="3.103101773s" podCreationTimestamp="2026-02-19 19:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:33.100220863 +0000 UTC m=+1392.712571197" watchObservedRunningTime="2026-02-19 19:41:33.103101773 +0000 UTC m=+1392.715452097"
Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.380287 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.079852 4722 generic.go:334] "Generic (PLEG): container finished" podID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68" exitCode=143
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.079939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerDied","Data":"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"}
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.080269 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2vsgx" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server" containerID="cri-o://99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0" gracePeriod=2
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.080511 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8"
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.630170 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.738177 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"5a7bcc56-4611-489d-8f1b-2105503393de\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") "
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.738323 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"5a7bcc56-4611-489d-8f1b-2105503393de\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") "
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.738385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"5a7bcc56-4611-489d-8f1b-2105503393de\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") "
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.744234 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx" (OuterVolumeSpecName: "kube-api-access-8rrkx") pod "5a7bcc56-4611-489d-8f1b-2105503393de" (UID: "5a7bcc56-4611-489d-8f1b-2105503393de"). InnerVolumeSpecName "kube-api-access-8rrkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.758639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities" (OuterVolumeSpecName: "utilities") pod "5a7bcc56-4611-489d-8f1b-2105503393de" (UID: "5a7bcc56-4611-489d-8f1b-2105503393de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.840279 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.840311 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.880432 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a7bcc56-4611-489d-8f1b-2105503393de" (UID: "5a7bcc56-4611-489d-8f1b-2105503393de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.942723 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.094965 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a7bcc56-4611-489d-8f1b-2105503393de" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0" exitCode=0
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.095081 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.095098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"}
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.096198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338"}
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.096243 4722 scope.go:117] "RemoveContainer" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.132747 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"]
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.138368 4722 scope.go:117] "RemoveContainer" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.150406 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"]
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.173277 4722 scope.go:117] "RemoveContainer" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.230877 4722 scope.go:117] "RemoveContainer" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"
Feb 19 19:41:35 crc kubenswrapper[4722]: E0219 19:41:35.236561 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0\": container with ID starting with 99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0 not found: ID does not exist" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.236662 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"} err="failed to get container status \"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0\": rpc error: code = NotFound desc = could not find container \"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0\": container with ID starting with 99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0 not found: ID does not exist"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.236762 4722 scope.go:117] "RemoveContainer" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"
Feb 19 19:41:35 crc kubenswrapper[4722]: E0219 19:41:35.239561 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6\": container with ID starting with 2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6 not found: ID does not exist" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.239587 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"} err="failed to get container status \"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6\": rpc error: code = NotFound desc = could not find container \"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6\": container with ID starting with 2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6 not found: ID does not exist"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.239606 4722 scope.go:117] "RemoveContainer" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772"
Feb 19 19:41:35 crc kubenswrapper[4722]: E0219 19:41:35.242433 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772\": container with ID starting with 85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772 not found: ID does not exist" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772"
Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.242458 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772"} err="failed to get container status \"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772\": rpc error: code = NotFound desc = could not find container \"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772\": container with ID starting with 85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772 not found: ID does not exist"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.750614 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.883830 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.883925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.884007 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.884071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.884692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs" (OuterVolumeSpecName: "logs") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.893552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk" (OuterVolumeSpecName: "kube-api-access-78gnk") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "kube-api-access-78gnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.952441 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.965700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data" (OuterVolumeSpecName: "config-data") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988840 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988867 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988878 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988888 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.083189 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35309d31_c095_492f_8645_f99a629dafd5.slice/crio-797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.138733 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" path="/var/lib/kubelet/pods/5a7bcc56-4611-489d-8f1b-2105503393de/volumes"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.240425 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" containerID="797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443" exitCode=0
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.240521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443"}
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272493 4722 generic.go:334] "Generic (PLEG): container finished" podID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea" exitCode=0
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerDied","Data":"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"}
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerDied","Data":"0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7"}
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272601 4722 scope.go:117] "RemoveContainer" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272834 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.366749 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.393235 4722 scope.go:117] "RemoveContainer" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.397411 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.417696 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.418255 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.418276 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log"
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.418301 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.418310 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server"
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.431572 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-utilities"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.431618 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-utilities"
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.431662 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.431671 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api"
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.431684 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-content"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.431690 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-content"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.432064 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.432085 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.432106 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.436327 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.436409 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443047 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443417 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443616 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443805 4722 scope.go:117] "RemoveContainer" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.444546 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea\": container with ID starting with 8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea not found: ID does not exist" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.444586 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"} err="failed to get container status \"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea\": rpc error: code = NotFound desc = could not find container \"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea\": container with ID starting with 8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea not found: ID does not exist"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.444618 4722 scope.go:117] "RemoveContainer" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"
Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.446356 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68\": container with ID starting with 3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68 not found: ID does not exist" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.446382 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"} err="failed to get container status \"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68\": rpc error: code = NotFound desc = could not find container \"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68\": container with ID starting with 3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68 not found: ID does not exist"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.500996 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501257 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.602852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.603145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.604542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605419 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.613882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.614796 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.617284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.617421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.621723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.701743 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.754867 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.809641 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.809946 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810102 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") "
Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810490 4722
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.811447 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.811571 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.814321 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts" (OuterVolumeSpecName: "scripts") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.815394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq" (OuterVolumeSpecName: "kube-api-access-vfrfq") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "kube-api-access-vfrfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.845452 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.887625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930100 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930128 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930232 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930269 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930279 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.937374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.958419 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data" (OuterVolumeSpecName: "config-data") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.032243 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.032277 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.287808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121"} Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.287871 4722 scope.go:117] "RemoveContainer" containerID="9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.287921 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.311113 4722 scope.go:117] "RemoveContainer" containerID="aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.330524 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.380918 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.406200 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.470532 4722 scope.go:117] "RemoveContainer" containerID="797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.523937 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.546290 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.556825 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557348 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557369 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core" Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557389 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557397 
4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557411 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557421 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent" Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557445 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557454 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557697 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557713 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557741 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557760 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.560942 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.563466 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.566593 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.566843 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.570762 4722 scope.go:117] "RemoveContainer" containerID="f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.571909 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645705 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 
19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.748930 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.748978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749290 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749512 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.750114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.750353 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.754332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.771144 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.771411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.771566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.775008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.779738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.885938 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.081170 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35309d31-c095-492f-8645-f99a629dafd5" path="/var/lib/kubelet/pods/35309d31-c095-492f-8645-f99a629dafd5/volumes" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.082242 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" path="/var/lib/kubelet/pods/96772512-a8ae-42f5-b8ce-748d1115c4ef/volumes" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.300243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerStarted","Data":"8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8"} Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.300521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerStarted","Data":"7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8"} Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.300533 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerStarted","Data":"169addbbb50b70578d635a167ff6f9ab09ad19713f22931dee2d5c96156a2632"} Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.322295 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.324212 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.32419915 podStartE2EDuration="2.32419915s" podCreationTimestamp="2026-02-19 19:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-19 19:41:39.322544449 +0000 UTC m=+1398.934894773" watchObservedRunningTime="2026-02-19 19:41:39.32419915 +0000 UTC m=+1398.936549494" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.440268 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:39 crc kubenswrapper[4722]: W0219 19:41:39.441844 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6ac8e6_55da_4c9f_b3b3_fc60afc50c37.slice/crio-70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532 WatchSource:0}: Error finding container 70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532: Status 404 returned error can't find the container with id 70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532 Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.489611 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"] Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.492246 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.496020 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.496965 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.504605 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"] Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565367 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqv2\" (UniqueName: 
\"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668361 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668410 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.675829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.676084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.676333 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.684891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.816862 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.290241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"] Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.328354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerStarted","Data":"a029982f23c93dbc7ae7b59f97f09872491299ab2ae9f8e1237d81964fadfc32"} Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.387473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679"} Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.387518 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532"} Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.610339 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.688953 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"] Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.689890 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" containerID="cri-o://42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e" gracePeriod=10 Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.276319 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409498 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409564 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409594 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409765 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409802 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.416612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d"} Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418127 4722 generic.go:334] "Generic (PLEG): container finished" podID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e" exitCode=0 Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerDied","Data":"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"} Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418212 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerDied","Data":"665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06"} Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418228 4722 scope.go:117] "RemoveContainer" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418361 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.423345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm" (OuterVolumeSpecName: "kube-api-access-5dvwm") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "kube-api-access-5dvwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.423680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerStarted","Data":"1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63"} Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.446657 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2lhsl" podStartSLOduration=2.446639669 podStartE2EDuration="2.446639669s" podCreationTimestamp="2026-02-19 19:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:41.441658603 +0000 UTC m=+1401.054008937" watchObservedRunningTime="2026-02-19 19:41:41.446639669 +0000 UTC m=+1401.058989993" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.519427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.520904 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config" (OuterVolumeSpecName: "config") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.527575 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.542950 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.544708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.582613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630008 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630050 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630062 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630075 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630086 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.632626 4722 scope.go:117] "RemoveContainer" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.669216 4722 scope.go:117] "RemoveContainer" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e" Feb 19 19:41:41 crc kubenswrapper[4722]: E0219 19:41:41.670024 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e\": container with ID starting with 42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e not found: ID does not exist" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.670078 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"} err="failed to get container status \"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e\": rpc error: code = NotFound desc = could not find container \"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e\": container with ID starting with 42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e not found: ID does not exist" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.670105 4722 scope.go:117] "RemoveContainer" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b" Feb 19 19:41:41 crc kubenswrapper[4722]: E0219 19:41:41.670808 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b\": container with ID starting with 3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b not found: ID does not exist" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.670867 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b"} err="failed to get container status \"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b\": rpc error: code = NotFound desc = could not find container \"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b\": container with ID 
starting with 3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b not found: ID does not exist" Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.767416 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"] Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.786430 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"] Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.798435 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.798486 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:41:42 crc kubenswrapper[4722]: I0219 19:41:42.437790 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f"} Feb 19 19:41:43 crc kubenswrapper[4722]: I0219 19:41:43.087107 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" path="/var/lib/kubelet/pods/5e629ce1-0108-4450-bb62-44ca1d2993b6/volumes" Feb 19 19:41:44 crc kubenswrapper[4722]: I0219 19:41:44.458767 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4"} Feb 19 19:41:44 crc kubenswrapper[4722]: I0219 19:41:44.459508 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:41:44 crc kubenswrapper[4722]: I0219 19:41:44.493280 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8494755619999999 podStartE2EDuration="6.49325827s" podCreationTimestamp="2026-02-19 19:41:38 +0000 UTC" firstStartedPulling="2026-02-19 19:41:39.443989113 +0000 UTC m=+1399.056339447" lastFinishedPulling="2026-02-19 19:41:44.087771831 +0000 UTC m=+1403.700122155" observedRunningTime="2026-02-19 19:41:44.480348426 +0000 UTC m=+1404.092698750" watchObservedRunningTime="2026-02-19 19:41:44.49325827 +0000 UTC m=+1404.105608594" Feb 19 19:41:46 crc kubenswrapper[4722]: I0219 19:41:46.481189 4722 generic.go:334] "Generic (PLEG): container finished" podID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerID="1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63" exitCode=0 Feb 19 19:41:46 crc kubenswrapper[4722]: I0219 19:41:46.481384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerDied","Data":"1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63"} Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.755415 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.755928 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.917786 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.971116 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2" (OuterVolumeSpecName: "kube-api-access-4cqv2") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "kube-api-access-4cqv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.974261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts" (OuterVolumeSpecName: "scripts") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.998433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data" (OuterVolumeSpecName: "config-data") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.023317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067618 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067653 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067665 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067673 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.544677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerDied","Data":"a029982f23c93dbc7ae7b59f97f09872491299ab2ae9f8e1237d81964fadfc32"} Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.544935 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a029982f23c93dbc7ae7b59f97f09872491299ab2ae9f8e1237d81964fadfc32" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.544983 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.693299 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.693600 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" containerID="cri-o://fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.726352 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.726597 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log" containerID="cri-o://7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.726670 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api" containerID="cri-o://8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.754308 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": EOF" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.758307 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.770093 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.770406 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" containerID="cri-o://9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.770754 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" containerID="cri-o://a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346" gracePeriod=30 Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.555295 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerID="7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8" exitCode=143 Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.555368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerDied","Data":"7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8"} Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.557692 4722 generic.go:334] "Generic (PLEG): container finished" podID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerID="9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2" exitCode=143 Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.557736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerDied","Data":"9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2"} Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.080717 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.081123 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.081326 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.081352 4722 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:50 
crc kubenswrapper[4722]: I0219 19:41:50.210919 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.314962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.315030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.315100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.321103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24" (OuterVolumeSpecName: "kube-api-access-qjj24") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9"). InnerVolumeSpecName "kube-api-access-qjj24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.360615 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data podName:339423c2-068b-48f8-8117-04f6a37ceaf9 nodeName:}" failed. 
No retries permitted until 2026-02-19 19:41:50.860584784 +0000 UTC m=+1410.472935098 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9") : error deleting /var/lib/kubelet/pods/339423c2-068b-48f8-8117-04f6a37ceaf9/volume-subpaths: remove /var/lib/kubelet/pods/339423c2-068b-48f8-8117-04f6a37ceaf9/volume-subpaths: no such file or directory Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.365322 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.417208 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.417242 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567543 4722 generic.go:334] "Generic (PLEG): container finished" podID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" exitCode=0 Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerDied","Data":"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40"} Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerDied","Data":"b861b10dd2203a847e9aec1463641c9c77079ce2640617c70a1b02fbcb8c691f"} Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567614 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567625 4722 scope.go:117] "RemoveContainer" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.588798 4722 scope.go:117] "RemoveContainer" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.589312 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40\": container with ID starting with fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 not found: ID does not exist" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.589367 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40"} err="failed to get container status \"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40\": rpc error: code = NotFound desc = could not find container \"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40\": container with ID starting with 
fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 not found: ID does not exist" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.929021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.933345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data" (OuterVolumeSpecName: "config-data") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.031595 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.200884 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.223474 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242303 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: E0219 19:41:51.242788 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242808 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" Feb 19 19:41:51 crc 
kubenswrapper[4722]: E0219 19:41:51.242829 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="init" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242837 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="init" Feb 19 19:41:51 crc kubenswrapper[4722]: E0219 19:41:51.242850 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242858 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:51 crc kubenswrapper[4722]: E0219 19:41:51.242876 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerName="nova-manage" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242886 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerName="nova-manage" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243112 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243124 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243132 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerName="nova-manage" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243959 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.250378 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.259291 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.352135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-config-data\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.352511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.352671 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcdrh\" (UniqueName: \"kubernetes.io/projected/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-kube-api-access-bcdrh\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.454797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-config-data\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.455269 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.455830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcdrh\" (UniqueName: \"kubernetes.io/projected/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-kube-api-access-bcdrh\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.460081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.462907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-config-data\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.476332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcdrh\" (UniqueName: \"kubernetes.io/projected/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-kube-api-access-bcdrh\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.566827 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.912180 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": read tcp 10.217.0.2:33674->10.217.0.217:8775: read: connection reset by peer" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.912497 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": read tcp 10.217.0.2:33684->10.217.0.217:8775: read: connection reset by peer" Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.006328 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:52 crc kubenswrapper[4722]: W0219 19:41:52.011497 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f1f5c9a_dacb_45b5_95bf_2e62a12a908b.slice/crio-5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7 WatchSource:0}: Error finding container 5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7: Status 404 returned error can't find the container with id 5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7 Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.594897 4722 generic.go:334] "Generic (PLEG): container finished" podID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerID="a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346" exitCode=0 Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.594998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerDied","Data":"a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346"} Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.597664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b","Type":"ContainerStarted","Data":"634cc860abf2b738b05ad29a3d1f796b3e6b10bd722e31641e2a789979bc2e2e"} Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.597710 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b","Type":"ContainerStarted","Data":"5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7"} Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.618722 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.618696181 podStartE2EDuration="1.618696181s" podCreationTimestamp="2026-02-19 19:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:52.613886941 +0000 UTC m=+1412.226237275" watchObservedRunningTime="2026-02-19 19:41:52.618696181 +0000 UTC m=+1412.231046505" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.088987 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" path="/var/lib/kubelet/pods/339423c2-068b-48f8-8117-04f6a37ceaf9/volumes" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.127473 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290170 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290347 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290425 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.291770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs" (OuterVolumeSpecName: "logs") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.296060 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx" (OuterVolumeSpecName: "kube-api-access-z9cmx") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "kube-api-access-z9cmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.334308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.339297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data" (OuterVolumeSpecName: "config-data") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.362408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394383 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394426 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394443 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394456 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394469 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.628823 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.634493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerDied","Data":"1d5b78abbb5e2a59e1b1457349c70f190753858ae936d0ab19bb55cc724af44f"} Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.634864 4722 scope.go:117] "RemoveContainer" containerID="a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.683211 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.704884 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726216 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:53 crc kubenswrapper[4722]: E0219 19:41:53.726734 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726753 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" Feb 19 19:41:53 crc kubenswrapper[4722]: E0219 19:41:53.726769 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726776 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726971 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" 
containerName="nova-metadata-log" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726994 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.728064 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.731195 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.743702 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.748011 4722 scope.go:117] "RemoveContainer" containerID="9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.765545 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.803883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmhf\" (UniqueName: \"kubernetes.io/projected/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-kube-api-access-hmmhf\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.803929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-config-data\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.804031 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-logs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.804111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.804131 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.905699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.905944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.906071 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmmhf\" (UniqueName: 
\"kubernetes.io/projected/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-kube-api-access-hmmhf\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.906490 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-config-data\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.907016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-logs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.907539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-logs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.917373 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.917715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-config-data\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 
19:41:53.930823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.944675 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmmhf\" (UniqueName: \"kubernetes.io/projected/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-kube-api-access-hmmhf\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0" Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.083348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.584678 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.638906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f2a647c-7a68-4e2c-aabf-b18973b20ad0","Type":"ContainerStarted","Data":"b195759e300cb1e12ca80f40c8381b49a1330fb39c6de4f6257f2bf3956e25e5"} Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.642105 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerID="8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8" exitCode=0 Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.642300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerDied","Data":"8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8"} Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.089602 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" path="/var/lib/kubelet/pods/3f9140da-76d7-4109-9892-23c1ceb60eaa/volumes" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.458179 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.646477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647107 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647317 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod 
\"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs" (OuterVolumeSpecName: "logs") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.648204 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.650816 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc" (OuterVolumeSpecName: "kube-api-access-8v5gc") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "kube-api-access-8v5gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.657109 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerDied","Data":"169addbbb50b70578d635a167ff6f9ab09ad19713f22931dee2d5c96156a2632"} Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.657182 4722 scope.go:117] "RemoveContainer" containerID="8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.657369 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.659498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f2a647c-7a68-4e2c-aabf-b18973b20ad0","Type":"ContainerStarted","Data":"8fbd7fe758e7bb8bd2bbeba5443112a38e88a0b29c952089007fe29f8e022d9d"} Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.659581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f2a647c-7a68-4e2c-aabf-b18973b20ad0","Type":"ContainerStarted","Data":"09c2109fd234ad7bb2ce0190198df4bda0db494ce8d35f3e782b36fe08e9d5b9"} Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.679762 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data" (OuterVolumeSpecName: "config-data") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.691738 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.691719117 podStartE2EDuration="2.691719117s" podCreationTimestamp="2026-02-19 19:41:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:55.68028371 +0000 UTC m=+1415.292634044" watchObservedRunningTime="2026-02-19 19:41:55.691719117 +0000 UTC m=+1415.304069441" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.692944 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.693395 4722 scope.go:117] "RemoveContainer" containerID="7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.717885 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.728399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750344 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750376 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750389 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750399 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750409 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.993303 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] 
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.008975 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.024912 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:56 crc kubenswrapper[4722]: E0219 19:41:56.025396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025420 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log" Feb 19 19:41:56 crc kubenswrapper[4722]: E0219 19:41:56.025440 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025448 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025696 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025739 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.034609 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.037967 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.038278 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.038481 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.040964 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzvt\" (UniqueName: \"kubernetes.io/projected/5aaacc6a-6882-467d-b66f-0178ccd35955-kube-api-access-fgzvt\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-public-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-config-data\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aaacc6a-6882-467d-b66f-0178ccd35955-logs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.159074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzvt\" (UniqueName: \"kubernetes.io/projected/5aaacc6a-6882-467d-b66f-0178ccd35955-kube-api-access-fgzvt\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-public-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " 
pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-config-data\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160752 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aaacc6a-6882-467d-b66f-0178ccd35955-logs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.161328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aaacc6a-6882-467d-b66f-0178ccd35955-logs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.164850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.167833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.168213 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-public-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.173189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-config-data\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.177800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgzvt\" (UniqueName: \"kubernetes.io/projected/5aaacc6a-6882-467d-b66f-0178ccd35955-kube-api-access-fgzvt\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.361083 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.567060 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.857766 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.082895 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" path="/var/lib/kubelet/pods/7cd6b11a-72fb-4116-8ccf-aee449ab564a/volumes" Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.685968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5aaacc6a-6882-467d-b66f-0178ccd35955","Type":"ContainerStarted","Data":"448fb2217cc376c1b0880710bf5dcf49e19a7dc4a1e40b77bafda98677e76a38"} Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.686283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5aaacc6a-6882-467d-b66f-0178ccd35955","Type":"ContainerStarted","Data":"8a415958535c5343d60761a608a136b6e4e7fac82bf7cd448cdb9af0e6aeb3e9"} Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.686295 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5aaacc6a-6882-467d-b66f-0178ccd35955","Type":"ContainerStarted","Data":"800e57beaef9bb1445abc3a44c898eaed106d2e06403c3ee50dc90d556b51a15"} Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.716190 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.716174223 podStartE2EDuration="2.716174223s" podCreationTimestamp="2026-02-19 19:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:57.714950916 +0000 UTC m=+1417.327301240" 
watchObservedRunningTime="2026-02-19 19:41:57.716174223 +0000 UTC m=+1417.328524547" Feb 19 19:41:59 crc kubenswrapper[4722]: I0219 19:41:59.086885 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 19:41:59 crc kubenswrapper[4722]: I0219 19:41:59.086963 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 19:42:01 crc kubenswrapper[4722]: I0219 19:42:01.567244 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 19:42:01 crc kubenswrapper[4722]: I0219 19:42:01.595750 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 19:42:01 crc kubenswrapper[4722]: I0219 19:42:01.784555 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 19:42:04 crc kubenswrapper[4722]: I0219 19:42:04.084340 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 19:42:04 crc kubenswrapper[4722]: I0219 19:42:04.084691 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 19:42:05 crc kubenswrapper[4722]: I0219 19:42:05.096355 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5f2a647c-7a68-4e2c-aabf-b18973b20ad0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 19:42:05 crc kubenswrapper[4722]: I0219 19:42:05.096430 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5f2a647c-7a68-4e2c-aabf-b18973b20ad0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Feb 19 19:42:06 crc kubenswrapper[4722]: I0219 19:42:06.362059 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:42:06 crc kubenswrapper[4722]: I0219 19:42:06.362123 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:42:07 crc kubenswrapper[4722]: I0219 19:42:07.374410 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5aaacc6a-6882-467d-b66f-0178ccd35955" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 19:42:07 crc kubenswrapper[4722]: I0219 19:42:07.374438 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5aaacc6a-6882-467d-b66f-0178ccd35955" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 19:42:08 crc kubenswrapper[4722]: I0219 19:42:08.897907 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 19:42:11 crc kubenswrapper[4722]: I0219 19:42:11.798870 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:42:11 crc kubenswrapper[4722]: I0219 19:42:11.799140 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 
19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.088272 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.088892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.094076 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.883216 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.370118 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.371061 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.372780 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.391003 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.895424 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.905654 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.148477 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"] Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.159136 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"] Feb 19 19:42:26 crc 
kubenswrapper[4722]: I0219 19:42:26.236778 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.238356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.240199 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.249563 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.394978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395332 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.496927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497220 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497248 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.502952 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.503001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.503670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.504438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " 
pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.518781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.559693 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.083819 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" path="/var/lib/kubelet/pods/fb399ce1-7269-4d99-9140-0d1d33a6fd6a/volumes" Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.183218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.822844 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823425 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" containerID="cri-o://28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" gracePeriod=30 Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823533 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" containerID="cri-o://4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" gracePeriod=30 Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823503 4722 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" containerID="cri-o://4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" gracePeriod=30 Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823546 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" containerID="cri-o://cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" gracePeriod=30 Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.004998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerStarted","Data":"b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08"} Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.005039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerStarted","Data":"b3b56e96586072af8aee5e36c8322fcbbe0a4be38b365b37d145b3ead8b232b8"} Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.008384 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" exitCode=2 Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.008440 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f"} Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.028430 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-rwjf7" podStartSLOduration=1.855650898 podStartE2EDuration="2.028405115s" 
podCreationTimestamp="2026-02-19 19:42:26 +0000 UTC" firstStartedPulling="2026-02-19 19:42:27.19410131 +0000 UTC m=+1446.806451634" lastFinishedPulling="2026-02-19 19:42:27.366855527 +0000 UTC m=+1446.979205851" observedRunningTime="2026-02-19 19:42:28.019561478 +0000 UTC m=+1447.631911802" watchObservedRunningTime="2026-02-19 19:42:28.028405115 +0000 UTC m=+1447.640755439" Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.565707 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.631393 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020208 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" exitCode=0 Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020241 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" exitCode=0 Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4"} Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679"} Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.883626 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906841 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906941 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906974 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907014 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907041 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907173 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907775 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.920163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts" (OuterVolumeSpecName: "scripts") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.928190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq" (OuterVolumeSpecName: "kube-api-access-sz5fq") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "kube-api-access-sz5fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.002372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009073 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009721 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009795 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009856 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009912 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045278 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data" (OuterVolumeSpecName: "config-data") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045405 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045313 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" exitCode=0 Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d"} Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045750 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532"} Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045829 4722 scope.go:117] "RemoveContainer" containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.055745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.094360 4722 scope.go:117] "RemoveContainer" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.109443 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.111904 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.111930 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.111939 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.125856 4722 scope.go:117] "RemoveContainer" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.163429 4722 scope.go:117] "RemoveContainer" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.187360 4722 scope.go:117] "RemoveContainer" 
containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.189635 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4\": container with ID starting with 4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4 not found: ID does not exist" containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.189690 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4"} err="failed to get container status \"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4\": rpc error: code = NotFound desc = could not find container \"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4\": container with ID starting with 4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4 not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.189716 4722 scope.go:117] "RemoveContainer" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.190007 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f\": container with ID starting with 4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f not found: ID does not exist" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.190032 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f"} err="failed to get container status \"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f\": rpc error: code = NotFound desc = could not find container \"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f\": container with ID starting with 4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.190044 4722 scope.go:117] "RemoveContainer" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.192172 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d\": container with ID starting with cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d not found: ID does not exist" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.192225 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d"} err="failed to get container status \"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d\": rpc error: code = NotFound desc = could not find container \"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d\": container with ID starting with cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.192261 4722 scope.go:117] "RemoveContainer" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.192768 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679\": container with ID starting with 28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679 not found: ID does not exist" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.192797 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679"} err="failed to get container status \"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679\": rpc error: code = NotFound desc = could not find container \"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679\": container with ID starting with 28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679 not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.432142 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.441126 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.483436 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.483984 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484045 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.484110 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" Feb 19 19:42:30 crc 
kubenswrapper[4722]: I0219 19:42:30.484177 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.484265 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484317 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.484370 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484421 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484640 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484709 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484764 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484819 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.515180 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8658k\" (UniqueName: \"kubernetes.io/projected/1e7133a0-5642-4b7b-a560-d215b7fd75cd-kube-api-access-8658k\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521779 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521812 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521914 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-config-data\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-scripts\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.522571 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.532764 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.538468 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.547599 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-config-data\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-scripts\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.623100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8658k\" (UniqueName: \"kubernetes.io/projected/1e7133a0-5642-4b7b-a560-d215b7fd75cd-kube-api-access-8658k\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.623278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.623386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.627351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-config-data\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.628856 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.629544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.630166 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.630222 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.634678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-scripts\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.640843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.675239 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8658k\" (UniqueName: \"kubernetes.io/projected/1e7133a0-5642-4b7b-a560-d215b7fd75cd-kube-api-access-8658k\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.876418 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.077478 4722 generic.go:334] "Generic (PLEG): container finished" podID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerID="b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08" exitCode=0 Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.087100 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" path="/var/lib/kubelet/pods/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37/volumes" Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.088003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerDied","Data":"b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08"} Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.406268 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:32 crc kubenswrapper[4722]: I0219 19:42:32.453723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"31a8f7404f8354090b9f106b04a82029d92d5b34c33d3d24f9b8b2de1a682a99"} Feb 19 19:42:32 crc kubenswrapper[4722]: I0219 19:42:32.987427 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155494 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155571 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.166479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6" (OuterVolumeSpecName: "kube-api-access-vpfg6") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "kube-api-access-vpfg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.184825 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs" (OuterVolumeSpecName: "certs") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.189356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts" (OuterVolumeSpecName: "scripts") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.197070 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data" (OuterVolumeSpecName: "config-data") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.210435 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.250296 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"] Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.259575 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260023 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260065 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260077 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260086 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.266958 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"] Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.364216 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:42:33 crc kubenswrapper[4722]: E0219 19:42:33.364612 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerName="cloudkitty-db-sync" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.364628 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerName="cloudkitty-db-sync" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.364816 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerName="cloudkitty-db-sync" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.365752 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.407212 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.474334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerDied","Data":"b3b56e96586072af8aee5e36c8322fcbbe0a4be38b365b37d145b3ead8b232b8"} Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.474371 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b56e96586072af8aee5e36c8322fcbbe0a4be38b365b37d145b3ead8b232b8" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.474428 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567593 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"cloudkitty-storageinit-77rmn\" (UID: 
\"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669839 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669858 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " 
pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.676749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.677087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.684347 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.686113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.697825 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.983109 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.103503 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" containerID="cri-o://fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285" gracePeriod=604795 Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.129578 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.196115 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq" containerID="cri-o://37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4" gracePeriod=604795 Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.768724 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Feb 19 19:42:35 crc kubenswrapper[4722]: I0219 19:42:35.086513 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" path="/var/lib/kubelet/pods/1725704f-c153-4de4-9246-87c6a5e878ea/volumes" Feb 19 19:42:36 crc kubenswrapper[4722]: I0219 19:42:36.262695 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:42:36 crc kubenswrapper[4722]: I0219 19:42:36.514600 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"a004167d31b52f8c806e89b20758fe28662af134f7e6e33b89fe228caeb98f77"} Feb 19 19:42:36 crc kubenswrapper[4722]: I0219 19:42:36.528076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerStarted","Data":"362971b9a3d43c23fcf4d469e680dc4a2402d33c6f74122b268ea70290fba5b3"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.542260 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"6a5cb7d2117f83b267ee6ac4e148de6215240af48d16642017360e18757f8212"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.542745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"48de6046a4b3a2b298a2baa908f92364157f6b322e36c252526912b33b7a56d1"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.543960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerStarted","Data":"6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.586975 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-77rmn" podStartSLOduration=4.586954189 podStartE2EDuration="4.586954189s" podCreationTimestamp="2026-02-19 19:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:37.563833756 +0000 UTC m=+1457.176184080" watchObservedRunningTime="2026-02-19 19:42:37.586954189 +0000 UTC m=+1457.199304523" Feb 19 19:42:38 crc 
kubenswrapper[4722]: I0219 19:42:38.555164 4722 generic.go:334] "Generic (PLEG): container finished" podID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerID="6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb" exitCode=0 Feb 19 19:42:38 crc kubenswrapper[4722]: I0219 19:42:38.555267 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerDied","Data":"6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb"} Feb 19 19:42:39 crc kubenswrapper[4722]: I0219 19:42:39.575412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"93569e7cbcdc63a223dc3937f2e787df75351ff24e9699ce9ca7c60fe16bb23b"} Feb 19 19:42:39 crc kubenswrapper[4722]: I0219 19:42:39.598859 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.197036399 podStartE2EDuration="9.598840793s" podCreationTimestamp="2026-02-19 19:42:30 +0000 UTC" firstStartedPulling="2026-02-19 19:42:31.410205187 +0000 UTC m=+1451.022555511" lastFinishedPulling="2026-02-19 19:42:38.812009581 +0000 UTC m=+1458.424359905" observedRunningTime="2026-02-19 19:42:39.59842743 +0000 UTC m=+1459.210777764" watchObservedRunningTime="2026-02-19 19:42:39.598840793 +0000 UTC m=+1459.211191107" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.174890 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.329986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330215 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330252 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330350 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330406 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.337294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts" (OuterVolumeSpecName: "scripts") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.345311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv" (OuterVolumeSpecName: "kube-api-access-55sdv") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "kube-api-access-55sdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.345542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs" (OuterVolumeSpecName: "certs") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.368450 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.371674 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data" (OuterVolumeSpecName: "config-data") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432763 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432822 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432848 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432869 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432892 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.584726 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.584725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerDied","Data":"362971b9a3d43c23fcf4d469e680dc4a2402d33c6f74122b268ea70290fba5b3"} Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.584849 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="362971b9a3d43c23fcf4d469e680dc4a2402d33c6f74122b268ea70290fba5b3" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.587060 4722 generic.go:334] "Generic (PLEG): container finished" podID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerID="fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285" exitCode=0 Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.587131 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerDied","Data":"fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285"} Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.592333 4722 generic.go:334] "Generic (PLEG): container finished" podID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerID="37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4" exitCode=0 Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.592399 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerDied","Data":"37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4"} Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.592657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.720197 4722 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842007 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842141 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842234 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842316 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 
19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.844994 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.845170 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.847211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.853510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info" (OuterVolumeSpecName: "pod-info") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.856141 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.857188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.858411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r" (OuterVolumeSpecName: "kube-api-access-56t8r") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "kube-api-access-56t8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.885234 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda" (OuterVolumeSpecName: "persistence") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.887308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data" (OuterVolumeSpecName: "config-data") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.919452 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.926356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf" (OuterVolumeSpecName: "server-conf") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946878 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946912 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946926 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946954 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") on node \"crc\" " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946969 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946981 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946993 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.947005 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.947016 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.947025 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.002221 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.020961 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.021649 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda") on node "crc" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.047870 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.047930 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048119 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod 
\"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048161 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.049202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 
19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050915 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050973 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051709 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051732 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051746 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051762 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.053504 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.058012 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). 
InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.059425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh" (OuterVolumeSpecName: "kube-api-access-k5nxh") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "kube-api-access-k5nxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.067677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.069281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info" (OuterVolumeSpecName: "pod-info") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.137534 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51" (OuterVolumeSpecName: "persistence") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "pvc-a5fb8482-d574-4930-864c-175c2bedef51". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156073 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156123 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") on node \"crc\" " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156137 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156169 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156179 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156188 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.165276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf" (OuterVolumeSpecName: "server-conf") pod 
"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.183338 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data" (OuterVolumeSpecName: "config-data") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.258854 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.258995 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.261412 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.262286 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a5fb8482-d574-4930-864c-175c2bedef51" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51") on node "crc" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.341838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.361322 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.361359 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.363034 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.363286 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerName="cloudkitty-proc" containerID="cri-o://20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e" gracePeriod=30 Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.389592 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.389861 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" containerID="cri-o://35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422" gracePeriod=30 Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.389964 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" 
containerID="cri-o://fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971" gracePeriod=30 Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.610805 4722 generic.go:334] "Generic (PLEG): container finished" podID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422" exitCode=143 Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.610878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerDied","Data":"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"} Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.614527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerDied","Data":"24279a6d2caf7ad4b1f181fa89124ed3ff752cfc1180df75df7a96c88d0345e2"} Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.614582 4722 scope.go:117] "RemoveContainer" containerID="fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.614553 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.627415 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.627465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerDied","Data":"3a2845abf856d9cafaeec46534beacb5f3f1990d5bed57b69cf295f8fe01e4f1"} Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.655426 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.670622 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.707369 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.707922 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.707948 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.707966 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.707974 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq" Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.708004 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="setup-container" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708012 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="setup-container" Feb 19 
19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.708033 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerName="cloudkitty-storageinit" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708043 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerName="cloudkitty-storageinit" Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.708057 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="setup-container" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708064 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="setup-container" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708448 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708484 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerName="cloudkitty-storageinit" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708499 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.716962 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.717212 4722 scope.go:117] "RemoveContainer" containerID="c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721248 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cbm8q" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721392 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721528 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721584 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.747218 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.751852 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.752042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: 
I0219 19:42:41.769245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769272 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769369 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769435 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f14785b-2e99-4110-9523-78ec32490e71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vflz\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-kube-api-access-9vflz\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f14785b-2e99-4110-9523-78ec32490e71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769549 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769577 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.796871 4722 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.798800 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.798844 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.798884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.802668 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.802789 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17" gracePeriod=600 Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.839758 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 
19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.863010 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.865276 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.869060 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.869516 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.869774 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.870055 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qdf2m" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f14785b-2e99-4110-9523-78ec32490e71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871594 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f14785b-2e99-4110-9523-78ec32490e71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vflz\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-kube-api-access-9vflz\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872477 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872868 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.873140 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.873814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.874425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.877687 4722 scope.go:117] "RemoveContainer" containerID="37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.877920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f14785b-2e99-4110-9523-78ec32490e71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.878132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f14785b-2e99-4110-9523-78ec32490e71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.879297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.879867 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.882875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.896621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.902750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.903710 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.903750 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5f6cee635ca5e2d348cf915d62a0dac8d2194b66bba55200fe901088eac3f7dd/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.921211 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vflz\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-kube-api-access-9vflz\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.958550 4722 scope.go:117] "RemoveContainer" containerID="e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973753 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973796 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: 
I0219 19:42:41.973909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.974002 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jm4f\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-kube-api-access-8jm4f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.974084 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.016721 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0" Feb 19 
19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084298 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084426 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084488 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084505 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jm4f\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-kube-api-access-8jm4f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.088585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.089719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.092929 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.092960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.093961 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.093972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.095866 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.096068 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.100596 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.112865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jm4f\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-kube-api-access-8jm4f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.113757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.224878 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.224926 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6408a1f41ebba08884844654cc07aafa4a02aa7486293e45dd19f823f7662d43/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.328700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.516696 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.664784 4722 generic.go:334] "Generic (PLEG): container finished" podID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerID="20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e" exitCode=0 Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.665238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerDied","Data":"20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e"} Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675435 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17" exitCode=0 Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675513 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17"} Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"} Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675566 4722 scope.go:117] "RemoveContainer" containerID="3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5" Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.803034 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.963902 4722 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.060563 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.085211 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" path="/var/lib/kubelet/pods/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45/volumes" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.086164 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" path="/var/lib/kubelet/pods/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f/volumes" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113284 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113525 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113589 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113616 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.127516 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts" (OuterVolumeSpecName: "scripts") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.141358 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs" (OuterVolumeSpecName: "certs") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.142431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.146582 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245" (OuterVolumeSpecName: "kube-api-access-zw245") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "kube-api-access-zw245". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.176505 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data" (OuterVolumeSpecName: "config-data") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.182401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219814 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219887 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219901 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219913 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219924 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219934 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.337813 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.338502 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" 
containerName="cloudkitty-proc" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.338518 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerName="cloudkitty-proc" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.338739 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerName="cloudkitty-proc" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.339810 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.342007 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.356829 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529062 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529287 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529346 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.594613 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.633182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.633530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.634948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635198 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" 
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635335 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.636408 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.634366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.637222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.634441 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.638005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.638759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.667930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.677744 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.694302 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.694310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerDied","Data":"fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.694355 4722 scope.go:117] "RemoveContainer" containerID="20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.722923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerStarted","Data":"bd0dba8d2d2388592c88639ec82f2ae2c7233392a83f9be2d726673271b2ec52"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727641 4722 generic.go:334] "Generic (PLEG): container finished" podID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971" exitCode=0 Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727713 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerDied","Data":"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerDied","Data":"42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727808 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.743904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.743987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744134 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744195 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744218 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744310 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744356 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.750872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerStarted","Data":"364ac221990d25f433c84726f172e8dd3fb628f012b85556817735eb2044da9f"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.752519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs" (OuterVolumeSpecName: "logs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.762957 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.763656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs" (OuterVolumeSpecName: "certs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.764109 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h" (OuterVolumeSpecName: "kube-api-access-br27h") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "kube-api-access-br27h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.764889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.775027 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.783724 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.783806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts" (OuterVolumeSpecName: "scripts") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.784124 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784142 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.784162 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784169 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784422 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784451 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" Feb 19 19:42:43 crc 
kubenswrapper[4722]: I0219 19:42:43.785398 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.796914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.799167 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.808858 4722 scope.go:117] "RemoveContainer" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846598 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846635 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846644 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846654 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846665 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc 
kubenswrapper[4722]: I0219 19:42:43.861926 4722 scope.go:117] "RemoveContainer" containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.908519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data" (OuterVolumeSpecName: "config-data") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.922419 4722 scope.go:117] "RemoveContainer" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971" Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.923911 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971\": container with ID starting with fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971 not found: ID does not exist" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.923950 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"} err="failed to get container status \"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971\": rpc error: code = NotFound desc = could not find container \"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971\": container with ID starting with fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971 not found: ID does not exist" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.923977 4722 scope.go:117] "RemoveContainer" 
containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422" Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.924341 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422\": container with ID starting with 35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422 not found: ID does not exist" containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.924374 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"} err="failed to get container status \"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422\": rpc error: code = NotFound desc = could not find container \"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422\": container with ID starting with 35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422 not found: ID does not exist" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948560 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-certs\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948726 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzb89\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-kube-api-access-dzb89\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.949188 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.949225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.949531 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051439 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-combined-ca-bundle\") pod 
\"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051574 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-certs\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzb89\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-kube-api-access-dzb89\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.055309 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.055518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.057311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-certs\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.057760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.113325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.113857 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzb89\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-kube-api-access-dzb89\") pod \"cloudkitty-proc-0\" (UID: 
\"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.147178 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.201244 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.260636 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.333561 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:42:44 crc kubenswrapper[4722]: W0219 19:42:44.500706 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a987597_e2e2_431d_9583_01f4dc2f4ecf.slice/crio-4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd WatchSource:0}: Error finding container 4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd: Status 404 returned error can't find the container with id 4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.711795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: 
"57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.772052 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.779912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerStarted","Data":"4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd"} Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.867312 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.943113 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.981628 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.082645 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" path="/var/lib/kubelet/pods/00bbae7e-ebc6-4102-9398-fc131546bbf5/volumes" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.341469 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.365002 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.376341 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.378664 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.382807 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.382906 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.384293 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.390362 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckks6\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-kube-api-access-ckks6\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497796 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497884 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc650d44-069f-41ed-b944-f1168dd5b25c-logs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.498048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.498295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-scripts\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.498371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-scripts\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckks6\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-kube-api-access-ckks6\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 
19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc650d44-069f-41ed-b944-f1168dd5b25c-logs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601653 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.608268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.608356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.610320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-scripts\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.610657 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.610985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc650d44-069f-41ed-b944-f1168dd5b25c-logs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.612631 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.612994 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.613788 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.629240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckks6\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-kube-api-access-ckks6\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.694965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.793638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerStarted","Data":"b95673087d8ea4b9a6a852d0c1a317e33ef78571ef0754777ff9655eec8f3615"} Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.797160 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f","Type":"ContainerStarted","Data":"2d1729315347d24f6c9daec53fbcd531b838e35e89ebdbe59edac0a29ffea465"} Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.797203 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f","Type":"ContainerStarted","Data":"f7c0dc467d58658d6bd1e1d8711abeeba50228c0e752ea74443cd5f54500974d"} Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.800810 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" 
exitCode=0 Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.801056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerDied","Data":"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8"} Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.831223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerStarted","Data":"62d771eb1e8a20f3816db1e78f60944ccbfed3ae437139348353bf9a91656d8f"} Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.916944 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.713768313 podStartE2EDuration="2.916920049s" podCreationTimestamp="2026-02-19 19:42:43 +0000 UTC" firstStartedPulling="2026-02-19 19:42:44.925192867 +0000 UTC m=+1464.537543191" lastFinishedPulling="2026-02-19 19:42:45.128344603 +0000 UTC m=+1464.740694927" observedRunningTime="2026-02-19 19:42:45.882484754 +0000 UTC m=+1465.494835078" watchObservedRunningTime="2026-02-19 19:42:45.916920049 +0000 UTC m=+1465.529270373" Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.258650 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fc650d44-069f-41ed-b944-f1168dd5b25c","Type":"ContainerStarted","Data":"1605922eead9cb45e86e6bd7dfadb78e8a868ca729ce40782cda5360ddc6cb27"} Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fc650d44-069f-41ed-b944-f1168dd5b25c","Type":"ContainerStarted","Data":"b95b47dca55bf6a59f7e571ae6a728815e128a589b41f6a4b9724ce08d5b5bfa"} 
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fc650d44-069f-41ed-b944-f1168dd5b25c","Type":"ContainerStarted","Data":"02aa0de2a5af9fdac494a793daf4a1810962c03e7c0554b5cacff7010d8b844d"} Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842629 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.844652 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerStarted","Data":"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4"} Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.845430 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.876697 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=1.876682264 podStartE2EDuration="1.876682264s" podCreationTimestamp="2026-02-19 19:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:46.869358265 +0000 UTC m=+1466.481708589" watchObservedRunningTime="2026-02-19 19:42:46.876682264 +0000 UTC m=+1466.489032588" Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.925294 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" podStartSLOduration=3.925273502 podStartE2EDuration="3.925273502s" podCreationTimestamp="2026-02-19 19:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:46.91336938 +0000 UTC m=+1466.525719704" 
watchObservedRunningTime="2026-02-19 19:42:46.925273502 +0000 UTC m=+1466.537623826" Feb 19 19:42:47 crc kubenswrapper[4722]: I0219 19:42:47.083970 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" path="/var/lib/kubelet/pods/57386acb-6299-4fd3-80a2-25d8769dcc93/volumes" Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.680295 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.748846 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.749497 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns" containerID="cri-o://07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb" gracePeriod=10 Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.871222 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-8zj5g"] Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.873003 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.897935 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-8zj5g"] Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.966872 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerID="07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb" exitCode=0 Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.966911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerDied","Data":"07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb"} Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.016792 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.016857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-svc\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.016973 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " 
pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-config\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017166 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc2q\" (UniqueName: \"kubernetes.io/projected/f6d970a0-c801-4472-a3b6-eccd8335d0a8-kube-api-access-zqc2q\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120233 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: 
\"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-config\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc2q\" (UniqueName: \"kubernetes.io/projected/f6d970a0-c801-4472-a3b6-eccd8335d0a8-kube-api-access-zqc2q\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " 
pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-svc\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.121543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.122046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-svc\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.122051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 
19:42:54.122944 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-config\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.123468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.156982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc2q\" (UniqueName: \"kubernetes.io/projected/f6d970a0-c801-4472-a3b6-eccd8335d0a8-kube-api-access-zqc2q\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.253579 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.389693 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.527883 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.527964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528006 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528051 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.533682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj" (OuterVolumeSpecName: "kube-api-access-2lfrj") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "kube-api-access-2lfrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.582629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config" (OuterVolumeSpecName: "config") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.583579 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.589980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.591121 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.592774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630822 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630856 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630866 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630874 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630884 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630891 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.739715 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-8zj5g"] Feb 19 19:42:54 crc kubenswrapper[4722]: W0219 19:42:54.745109 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d970a0_c801_4472_a3b6_eccd8335d0a8.slice/crio-6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a WatchSource:0}: Error finding container 6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a: Status 404 returned error can't find the container with id 6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.978055 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.978172 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerDied","Data":"a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4"} Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.978616 4722 scope.go:117] "RemoveContainer" containerID="07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.979844 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" event={"ID":"f6d970a0-c801-4472-a3b6-eccd8335d0a8","Type":"ContainerStarted","Data":"6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a"} Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.005710 4722 scope.go:117] "RemoveContainer" containerID="2645caf8bc3502647b4c5a4dc4d97510df5ceb77697881dbc41661d5cae80579" Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.011864 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.021396 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.086940 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" path="/var/lib/kubelet/pods/dfcca6fc-5afb-464c-9852-3532ba5878a3/volumes" Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.993116 4722 generic.go:334] "Generic (PLEG): container finished" podID="f6d970a0-c801-4472-a3b6-eccd8335d0a8" containerID="62081909198e335b4f853110f4fe5edc71f1a94287877a0e84078384c778ac69" exitCode=0 Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.993535 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" event={"ID":"f6d970a0-c801-4472-a3b6-eccd8335d0a8","Type":"ContainerDied","Data":"62081909198e335b4f853110f4fe5edc71f1a94287877a0e84078384c778ac69"} Feb 19 19:42:57 crc kubenswrapper[4722]: I0219 19:42:57.010841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" event={"ID":"f6d970a0-c801-4472-a3b6-eccd8335d0a8","Type":"ContainerStarted","Data":"a0203ad2b95afa13c3142f7de2f923065378b896fe80d57ec8368e46a4dd1048"} Feb 19 19:42:57 crc kubenswrapper[4722]: I0219 19:42:57.011217 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:57 crc kubenswrapper[4722]: I0219 19:42:57.056023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" podStartSLOduration=4.056001721 podStartE2EDuration="4.056001721s" podCreationTimestamp="2026-02-19 19:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:57.042080226 +0000 UTC m=+1476.654430580" watchObservedRunningTime="2026-02-19 19:42:57.056001721 +0000 UTC m=+1476.668352055" Feb 19 19:43:00 crc kubenswrapper[4722]: I0219 19:43:00.887375 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.256078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.323551 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.323802 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" 
podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns" containerID="cri-o://a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" gracePeriod=10 Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.886206 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.984770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.984892 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985120 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985189 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65zd2\" (UniqueName: 
\"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985233 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985277 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.993500 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2" (OuterVolumeSpecName: "kube-api-access-65zd2") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "kube-api-access-65zd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.041542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.041555 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.054471 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config" (OuterVolumeSpecName: "config") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.054799 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.055956 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.062010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087176 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087201 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087211 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087219 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087227 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087236 4722 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087244 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110218 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" exitCode=0 Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110259 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerDied","Data":"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4"} Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110291 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerDied","Data":"4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd"} Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110316 4722 scope.go:117] "RemoveContainer" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110472 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.151085 4722 scope.go:117] "RemoveContainer" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.153624 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.169641 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.183319 4722 scope.go:117] "RemoveContainer" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" Feb 19 19:43:05 crc kubenswrapper[4722]: E0219 19:43:05.183727 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4\": container with ID starting with a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4 not found: ID does not exist" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.183787 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4"} err="failed to get container status \"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4\": rpc error: code = NotFound desc = could not find container \"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4\": container with ID starting with a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4 not found: ID does not exist" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.183812 4722 scope.go:117] "RemoveContainer" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" Feb 19 
19:43:05 crc kubenswrapper[4722]: E0219 19:43:05.184308 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8\": container with ID starting with 82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8 not found: ID does not exist" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.184366 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8"} err="failed to get container status \"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8\": rpc error: code = NotFound desc = could not find container \"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8\": container with ID starting with 82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8 not found: ID does not exist" Feb 19 19:43:07 crc kubenswrapper[4722]: I0219 19:43:07.086041 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" path="/var/lib/kubelet/pods/7a987597-e2e2-431d-9583-01f4dc2f4ecf/volumes" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.273515 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f14785b-2e99-4110-9523-78ec32490e71" containerID="62d771eb1e8a20f3816db1e78f60944ccbfed3ae437139348353bf9a91656d8f" exitCode=0 Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.273618 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerDied","Data":"62d771eb1e8a20f3816db1e78f60944ccbfed3ae437139348353bf9a91656d8f"} Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.277707 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="9ac0e00c-0e1d-40fa-802d-8a77ac4c842b" containerID="b95673087d8ea4b9a6a852d0c1a317e33ef78571ef0754777ff9655eec8f3615" exitCode=0 Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.277772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerDied","Data":"b95673087d8ea4b9a6a852d0c1a317e33ef78571ef0754777ff9655eec8f3615"} Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.537739 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"] Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538614 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="init" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538634 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="init" Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538650 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538657 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns" Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538695 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538703 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns" Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538731 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="init" Feb 19 19:43:17 crc 
kubenswrapper[4722]: I0219 19:43:17.538739 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="init" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538993 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.539011 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.546950 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.551602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.551807 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.552010 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.555710 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"] Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.556450 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.611904 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.612102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.612171 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.612228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.713868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.714447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.714621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.714683 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.728628 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.729133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.730886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.741591 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.897888 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.325786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerStarted","Data":"b8e281ed8c18d780a9dc36ca1a7967bf4234e515374da1f7e0eb97a781e463de"} Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.327470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.335896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerStarted","Data":"18603bed4e5f7e71e772012ca13ea8f29124d9e40c05dc8079c98faf9d74aa51"} Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.336123 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.365880 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.365856455 podStartE2EDuration="37.365856455s" podCreationTimestamp="2026-02-19 19:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:43:18.348669828 +0000 UTC m=+1497.961020152" watchObservedRunningTime="2026-02-19 19:43:18.365856455 +0000 UTC m=+1497.978206779" Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.374920 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.374905277 podStartE2EDuration="37.374905277s" podCreationTimestamp="2026-02-19 19:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-19 19:43:18.368351573 +0000 UTC m=+1497.980701917" watchObservedRunningTime="2026-02-19 19:43:18.374905277 +0000 UTC m=+1497.987255601" Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.461747 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"] Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.199274 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lfr4g"] Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.201453 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.207970 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"] Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.348299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerStarted","Data":"80ec107254c616b2d4c87f564f37922dd485c1c585d1ed01ad5292b221ec5dfb"} Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.349648 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.349733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"community-operators-lfr4g\" (UID: 
\"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.350505 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.451773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.451954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.451999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.452297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod 
\"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.452478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.474348 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.566846 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.060772 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"] Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.366876 4722 generic.go:334] "Generic (PLEG): container finished" podID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316" exitCode=0 Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.367020 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316"} Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.367097 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerStarted","Data":"1e1f79d00b654bd29d0af3c012deee5c83d982006853865e13db335a0839341c"} Feb 19 19:43:22 crc kubenswrapper[4722]: I0219 19:43:22.719291 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 19 19:43:23 crc kubenswrapper[4722]: I0219 19:43:23.408979 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerStarted","Data":"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"} Feb 19 19:43:25 crc kubenswrapper[4722]: I0219 19:43:25.433390 4722 generic.go:334] "Generic (PLEG): container finished" podID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb" exitCode=0 Feb 19 19:43:25 crc kubenswrapper[4722]: I0219 19:43:25.433652 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"} Feb 19 19:43:28 crc kubenswrapper[4722]: I0219 19:43:28.616192 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.477071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerStarted","Data":"0df2affc7967aa9fbc1883fd8ed8d42f351642cd08abaac174535f8af0673d64"} Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.479738 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerStarted","Data":"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"} Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.498106 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" podStartSLOduration=2.349120012 podStartE2EDuration="12.498087112s" podCreationTimestamp="2026-02-19 19:43:17 +0000 UTC" firstStartedPulling="2026-02-19 19:43:18.464567408 +0000 UTC m=+1498.076917742" lastFinishedPulling="2026-02-19 19:43:28.613534518 +0000 UTC m=+1508.225884842" observedRunningTime="2026-02-19 19:43:29.490387972 +0000 UTC m=+1509.102738296" watchObservedRunningTime="2026-02-19 19:43:29.498087112 +0000 UTC m=+1509.110437436" Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.523017 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lfr4g" podStartSLOduration=3.335626735 podStartE2EDuration="10.52299515s" 
podCreationTimestamp="2026-02-19 19:43:19 +0000 UTC" firstStartedPulling="2026-02-19 19:43:21.704830479 +0000 UTC m=+1501.317180803" lastFinishedPulling="2026-02-19 19:43:28.892198894 +0000 UTC m=+1508.504549218" observedRunningTime="2026-02-19 19:43:29.522044841 +0000 UTC m=+1509.134395165" watchObservedRunningTime="2026-02-19 19:43:29.52299515 +0000 UTC m=+1509.135345474" Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.568256 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.568394 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:30 crc kubenswrapper[4722]: I0219 19:43:30.616359 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lfr4g" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" probeResult="failure" output=< Feb 19 19:43:30 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:43:30 crc kubenswrapper[4722]: > Feb 19 19:43:32 crc kubenswrapper[4722]: I0219 19:43:32.098377 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 19:43:32 crc kubenswrapper[4722]: I0219 19:43:32.518302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:43:35 crc kubenswrapper[4722]: I0219 19:43:35.132574 4722 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7a987597-e2e2-431d-9583-01f4dc2f4ecf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7a987597-e2e2-431d-9583-01f4dc2f4ecf] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7a987597_e2e2_431d_9583_01f4dc2f4ecf.slice" Feb 19 19:43:39 crc 
kubenswrapper[4722]: I0219 19:43:39.630128 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:39 crc kubenswrapper[4722]: I0219 19:43:39.689598 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:39 crc kubenswrapper[4722]: I0219 19:43:39.861972 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"] Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.614715 4722 generic.go:334] "Generic (PLEG): container finished" podID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerID="0df2affc7967aa9fbc1883fd8ed8d42f351642cd08abaac174535f8af0673d64" exitCode=0 Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.614798 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerDied","Data":"0df2affc7967aa9fbc1883fd8ed8d42f351642cd08abaac174535f8af0673d64"} Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.615183 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lfr4g" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" containerID="cri-o://9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f" gracePeriod=2 Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.923510 4722 scope.go:117] "RemoveContainer" containerID="ec6e9a5d8db1ce9bec823742a602001b48238109f03304859ab2fe4f5a1aeb10" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.044690 4722 scope.go:117] "RemoveContainer" containerID="044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.147335 4722 scope.go:117] "RemoveContainer" 
containerID="62cc34e349902eca38fc94fdcd77006a8905ea0cb9cbb3392c7d1c40da4629fc" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.328062 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.414907 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"03661e8e-c7dc-4b7a-b463-8ef04af17523\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.415138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod \"03661e8e-c7dc-4b7a-b463-8ef04af17523\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.415203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"03661e8e-c7dc-4b7a-b463-8ef04af17523\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.416027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities" (OuterVolumeSpecName: "utilities") pod "03661e8e-c7dc-4b7a-b463-8ef04af17523" (UID: "03661e8e-c7dc-4b7a-b463-8ef04af17523"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.421031 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2" (OuterVolumeSpecName: "kube-api-access-wbbt2") pod "03661e8e-c7dc-4b7a-b463-8ef04af17523" (UID: "03661e8e-c7dc-4b7a-b463-8ef04af17523"). InnerVolumeSpecName "kube-api-access-wbbt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.466402 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03661e8e-c7dc-4b7a-b463-8ef04af17523" (UID: "03661e8e-c7dc-4b7a-b463-8ef04af17523"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.517262 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.517290 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.517301 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.636395 4722 generic.go:334] "Generic (PLEG): container finished" podID="03661e8e-c7dc-4b7a-b463-8ef04af17523" 
containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f" exitCode=0 Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.637636 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.638010 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"} Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.638064 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"1e1f79d00b654bd29d0af3c012deee5c83d982006853865e13db335a0839341c"} Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.638096 4722 scope.go:117] "RemoveContainer" containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.671338 4722 scope.go:117] "RemoveContainer" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.687861 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"] Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.700355 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"] Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.712337 4722 scope.go:117] "RemoveContainer" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.742797 4722 scope.go:117] "RemoveContainer" containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f" Feb 19 
19:43:42 crc kubenswrapper[4722]: E0219 19:43:42.743744 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f\": container with ID starting with 9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f not found: ID does not exist" containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.743786 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"} err="failed to get container status \"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f\": rpc error: code = NotFound desc = could not find container \"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f\": container with ID starting with 9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f not found: ID does not exist" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.743814 4722 scope.go:117] "RemoveContainer" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb" Feb 19 19:43:42 crc kubenswrapper[4722]: E0219 19:43:42.744231 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb\": container with ID starting with 74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb not found: ID does not exist" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.744272 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"} err="failed to get container status 
\"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb\": rpc error: code = NotFound desc = could not find container \"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb\": container with ID starting with 74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb not found: ID does not exist" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.744305 4722 scope.go:117] "RemoveContainer" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316" Feb 19 19:43:42 crc kubenswrapper[4722]: E0219 19:43:42.744750 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316\": container with ID starting with 9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316 not found: ID does not exist" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316" Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.744783 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316"} err="failed to get container status \"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316\": rpc error: code = NotFound desc = could not find container \"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316\": container with ID starting with 9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316 not found: ID does not exist" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.083348 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" path="/var/lib/kubelet/pods/03661e8e-c7dc-4b7a-b463-8ef04af17523/volumes" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.149086 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255310 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255697 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255758 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.260689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd" (OuterVolumeSpecName: "kube-api-access-6jbwd") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "kube-api-access-6jbwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.262379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.291106 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.305277 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory" (OuterVolumeSpecName: "inventory") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358404 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358452 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358468 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358480 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.656695 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerDied","Data":"80ec107254c616b2d4c87f564f37922dd485c1c585d1ed01ad5292b221ec5dfb"} Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.656746 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ec107254c616b2d4c87f564f37922dd485c1c585d1ed01ad5292b221ec5dfb" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.656713 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.719827 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52"] Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720221 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-content" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720238 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-content" Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720250 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-utilities" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720258 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-utilities" Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720278 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720284 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720313 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720321 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720500 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720524 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.721264 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.731818 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52"] Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.733387 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.733678 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.733867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.739102 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.872660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 
19:43:43.872927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.873069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.975608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.975669 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.975693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.979122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.979678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.000376 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.096776 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.629609 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52"] Feb 19 19:43:44 crc kubenswrapper[4722]: W0219 19:43:44.638027 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2554051_f8a8_413e_b352_13ac8f88da63.slice/crio-3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de WatchSource:0}: Error finding container 3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de: Status 404 returned error can't find the container with id 3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.672733 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerStarted","Data":"3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de"} Feb 19 19:43:45 crc kubenswrapper[4722]: I0219 19:43:45.724971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerStarted","Data":"20998a5e9b0bbea9ed5fc65c67fcd47b1526c1aa9ab0f5cb2c3accf655d0120e"} Feb 19 19:43:45 crc kubenswrapper[4722]: I0219 19:43:45.753643 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" podStartSLOduration=2.358406764 podStartE2EDuration="2.753622511s" podCreationTimestamp="2026-02-19 19:43:43 +0000 UTC" firstStartedPulling="2026-02-19 19:43:44.641000141 +0000 UTC m=+1524.253350465" lastFinishedPulling="2026-02-19 19:43:45.036215878 +0000 UTC m=+1524.648566212" observedRunningTime="2026-02-19 
19:43:45.750195834 +0000 UTC m=+1525.362546188" watchObservedRunningTime="2026-02-19 19:43:45.753622511 +0000 UTC m=+1525.365972835" Feb 19 19:43:47 crc kubenswrapper[4722]: I0219 19:43:47.748741 4722 generic.go:334] "Generic (PLEG): container finished" podID="d2554051-f8a8-413e-b352-13ac8f88da63" containerID="20998a5e9b0bbea9ed5fc65c67fcd47b1526c1aa9ab0f5cb2c3accf655d0120e" exitCode=0 Feb 19 19:43:47 crc kubenswrapper[4722]: I0219 19:43:47.748855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerDied","Data":"20998a5e9b0bbea9ed5fc65c67fcd47b1526c1aa9ab0f5cb2c3accf655d0120e"} Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.282432 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.393019 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"d2554051-f8a8-413e-b352-13ac8f88da63\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.393190 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod \"d2554051-f8a8-413e-b352-13ac8f88da63\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.393242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"d2554051-f8a8-413e-b352-13ac8f88da63\" (UID: 
\"d2554051-f8a8-413e-b352-13ac8f88da63\") " Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.399076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9" (OuterVolumeSpecName: "kube-api-access-clwl9") pod "d2554051-f8a8-413e-b352-13ac8f88da63" (UID: "d2554051-f8a8-413e-b352-13ac8f88da63"). InnerVolumeSpecName "kube-api-access-clwl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.424624 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory" (OuterVolumeSpecName: "inventory") pod "d2554051-f8a8-413e-b352-13ac8f88da63" (UID: "d2554051-f8a8-413e-b352-13ac8f88da63"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.429560 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d2554051-f8a8-413e-b352-13ac8f88da63" (UID: "d2554051-f8a8-413e-b352-13ac8f88da63"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.495652 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.495692 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.495703 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.774809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerDied","Data":"3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de"} Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.775145 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.774916 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.868622 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4"] Feb 19 19:43:49 crc kubenswrapper[4722]: E0219 19:43:49.869223 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2554051-f8a8-413e-b352-13ac8f88da63" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.869254 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2554051-f8a8-413e-b352-13ac8f88da63" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.869515 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2554051-f8a8-413e-b352-13ac8f88da63" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.870425 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.874714 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.875001 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.875080 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.875265 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.889537 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4"] Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007493 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.115836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.115996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.116128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.128945 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.194920 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.753334 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4"] Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.786877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerStarted","Data":"6f9d6eb7165cf38cb4798873ca7e4eb22283d0374e3333f832035a7b8aca2450"} Feb 19 19:43:51 crc kubenswrapper[4722]: I0219 19:43:51.798594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerStarted","Data":"ed27796d3d25748986df212c115b101ecfee62c9f9764796d2d2ee4e35289aef"} Feb 19 19:43:51 crc kubenswrapper[4722]: I0219 19:43:51.818789 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" podStartSLOduration=2.3963085570000002 podStartE2EDuration="2.818769465s" podCreationTimestamp="2026-02-19 19:43:49 +0000 UTC" firstStartedPulling="2026-02-19 19:43:50.748441197 +0000 UTC m=+1530.360791521" 
lastFinishedPulling="2026-02-19 19:43:51.170902105 +0000 UTC m=+1530.783252429" observedRunningTime="2026-02-19 19:43:51.818233678 +0000 UTC m=+1531.430584002" watchObservedRunningTime="2026-02-19 19:43:51.818769465 +0000 UTC m=+1531.431119789" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.519620 4722 scope.go:117] "RemoveContainer" containerID="0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.581509 4722 scope.go:117] "RemoveContainer" containerID="39d3bd74fcad2b2ba6a5d3be195f9ef849a5a1caabbd2723eb1f1b100ba3c28c" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.617545 4722 scope.go:117] "RemoveContainer" containerID="3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.697602 4722 scope.go:117] "RemoveContainer" containerID="6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.732265 4722 scope.go:117] "RemoveContainer" containerID="0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.785690 4722 scope.go:117] "RemoveContainer" containerID="8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.826092 4722 scope.go:117] "RemoveContainer" containerID="8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.862357 4722 scope.go:117] "RemoveContainer" containerID="26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.881603 4722 scope.go:117] "RemoveContainer" containerID="df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.904971 4722 scope.go:117] "RemoveContainer" 
containerID="4dec94c6774384698a0cf861b554d74fb1ddd8514338b3e11d17056ce861d124" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.925942 4722 scope.go:117] "RemoveContainer" containerID="17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.159215 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm"] Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.161396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.164084 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.164625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.179690 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm"] Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.328249 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.328485 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod 
\"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.328575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.430280 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.430404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.430491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.431309 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.438760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.450258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.494984 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.990333 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm"] Feb 19 19:45:01 crc kubenswrapper[4722]: I0219 19:45:01.577741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerStarted","Data":"75756214ecf739e6539c33ec90e775742f12ea9cf526026780602fab4300d835"} Feb 19 19:45:01 crc kubenswrapper[4722]: I0219 19:45:01.578051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerStarted","Data":"2282f5e8c1fe4ccd890b8006981551ba2548f12c130fc4a075dd47c446dd0b2b"} Feb 19 19:45:01 crc kubenswrapper[4722]: I0219 19:45:01.599405 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" podStartSLOduration=1.599291807 podStartE2EDuration="1.599291807s" podCreationTimestamp="2026-02-19 19:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:45:01.594010602 +0000 UTC m=+1601.206360936" watchObservedRunningTime="2026-02-19 19:45:01.599291807 +0000 UTC m=+1601.211642131" Feb 19 19:45:02 crc kubenswrapper[4722]: I0219 19:45:02.588007 4722 generic.go:334] "Generic (PLEG): container finished" podID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerID="75756214ecf739e6539c33ec90e775742f12ea9cf526026780602fab4300d835" exitCode=0 Feb 19 19:45:02 crc kubenswrapper[4722]: I0219 19:45:02.588406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerDied","Data":"75756214ecf739e6539c33ec90e775742f12ea9cf526026780602fab4300d835"} Feb 19 19:45:03 crc kubenswrapper[4722]: I0219 19:45:03.995800 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.105269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.105577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.105824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.106349 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume" (OuterVolumeSpecName: "config-volume") pod "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" (UID: "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.106868 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.111065 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" (UID: "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.111371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8" (OuterVolumeSpecName: "kube-api-access-d6kz8") pod "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" (UID: "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6"). InnerVolumeSpecName "kube-api-access-d6kz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.209067 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.209109 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.613907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerDied","Data":"2282f5e8c1fe4ccd890b8006981551ba2548f12c130fc4a075dd47c446dd0b2b"} Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.613955 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2282f5e8c1fe4ccd890b8006981551ba2548f12c130fc4a075dd47c446dd0b2b" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.614019 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:11 crc kubenswrapper[4722]: I0219 19:45:11.798992 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:45:11 crc kubenswrapper[4722]: I0219 19:45:11.799643 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:45:41 crc kubenswrapper[4722]: I0219 19:45:41.798257 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:45:41 crc kubenswrapper[4722]: I0219 19:45:41.798662 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.271605 4722 scope.go:117] "RemoveContainer" containerID="2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.296002 4722 scope.go:117] "RemoveContainer" containerID="0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 
19:45:43.315795 4722 scope.go:117] "RemoveContainer" containerID="13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.342591 4722 scope.go:117] "RemoveContainer" containerID="8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.392319 4722 scope.go:117] "RemoveContainer" containerID="14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.798593 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799111 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799172 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799917 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799964 4722 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" gracePeriod=600 Feb 19 19:46:11 crc kubenswrapper[4722]: E0219 19:46:11.987544 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.524524 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" exitCode=0 Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.524571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"} Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.524619 4722 scope.go:117] "RemoveContainer" containerID="5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17" Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.525660 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:12 crc kubenswrapper[4722]: E0219 19:46:12.526632 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:26 crc kubenswrapper[4722]: I0219 19:46:26.071720 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:26 crc kubenswrapper[4722]: E0219 19:46:26.072489 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:37 crc kubenswrapper[4722]: I0219 19:46:37.072416 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:37 crc kubenswrapper[4722]: E0219 19:46:37.073650 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:43 crc kubenswrapper[4722]: I0219 19:46:43.510558 4722 scope.go:117] "RemoveContainer" containerID="a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56" Feb 19 19:46:43 crc kubenswrapper[4722]: I0219 19:46:43.538971 4722 scope.go:117] "RemoveContainer" containerID="1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0" Feb 19 19:46:47 crc kubenswrapper[4722]: I0219 19:46:47.909897 4722 
generic.go:334] "Generic (PLEG): container finished" podID="7573aaf8-263a-4e50-84da-58cf311829a9" containerID="ed27796d3d25748986df212c115b101ecfee62c9f9764796d2d2ee4e35289aef" exitCode=0 Feb 19 19:46:47 crc kubenswrapper[4722]: I0219 19:46:47.909976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerDied","Data":"ed27796d3d25748986df212c115b101ecfee62c9f9764796d2d2ee4e35289aef"} Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.421303 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.570635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.571137 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.571207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.571964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.576050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.580942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd" (OuterVolumeSpecName: "kube-api-access-mm4vd") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). InnerVolumeSpecName "kube-api-access-mm4vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.597384 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory" (OuterVolumeSpecName: "inventory") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.607230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675285 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675320 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675331 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675341 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.926659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerDied","Data":"6f9d6eb7165cf38cb4798873ca7e4eb22283d0374e3333f832035a7b8aca2450"} Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.926698 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f9d6eb7165cf38cb4798873ca7e4eb22283d0374e3333f832035a7b8aca2450" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.926759 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.032642 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8"] Feb 19 19:46:50 crc kubenswrapper[4722]: E0219 19:46:50.033226 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerName="collect-profiles" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033255 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerName="collect-profiles" Feb 19 19:46:50 crc kubenswrapper[4722]: E0219 19:46:50.033300 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7573aaf8-263a-4e50-84da-58cf311829a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033310 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7573aaf8-263a-4e50-84da-58cf311829a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033549 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7573aaf8-263a-4e50-84da-58cf311829a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerName="collect-profiles" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.034476 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.039667 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.039885 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.040035 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.040194 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.048203 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8"] Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.191431 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.191623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 
19:46:50.191687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.293909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.293984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.294035 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.300357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.301069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.316707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.353472 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.909900 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8"] Feb 19 19:46:50 crc kubenswrapper[4722]: W0219 19:46:50.912827 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a67d89_596c_44f0_b19d_dc5d1eb3021e.slice/crio-648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda WatchSource:0}: Error finding container 648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda: Status 404 returned error can't find the container with id 648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.916477 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.937333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerStarted","Data":"648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda"} Feb 19 19:46:51 crc kubenswrapper[4722]: I0219 19:46:51.082991 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:51 crc kubenswrapper[4722]: E0219 19:46:51.083698 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:51 crc kubenswrapper[4722]: I0219 19:46:51.952368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerStarted","Data":"5a4c6d10bcfa0da53cc4b9e38924013e1f28f0ece8007cef9ebd1b78c76f2e64"} Feb 19 19:46:51 crc kubenswrapper[4722]: I0219 19:46:51.980980 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" podStartSLOduration=1.585481175 podStartE2EDuration="1.980949241s" podCreationTimestamp="2026-02-19 19:46:50 +0000 UTC" firstStartedPulling="2026-02-19 19:46:50.916184916 +0000 UTC m=+1710.528535260" lastFinishedPulling="2026-02-19 19:46:51.311652992 +0000 UTC m=+1710.924003326" observedRunningTime="2026-02-19 19:46:51.976626276 +0000 UTC m=+1711.588976640" watchObservedRunningTime="2026-02-19 19:46:51.980949241 +0000 UTC m=+1711.593299605" Feb 19 19:47:02 crc kubenswrapper[4722]: I0219 19:47:02.071590 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:02 crc kubenswrapper[4722]: E0219 19:47:02.072687 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:15 crc kubenswrapper[4722]: I0219 19:47:15.071633 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:15 crc kubenswrapper[4722]: E0219 19:47:15.072429 4722 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:28 crc kubenswrapper[4722]: I0219 19:47:28.067697 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:47:28 crc kubenswrapper[4722]: I0219 19:47:28.085084 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:47:29 crc kubenswrapper[4722]: I0219 19:47:29.071607 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:29 crc kubenswrapper[4722]: E0219 19:47:29.071946 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:29 crc kubenswrapper[4722]: I0219 19:47:29.083983 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" path="/var/lib/kubelet/pods/1f940a76-c93f-46c5-af29-5b098a54adc8/volumes" Feb 19 19:47:30 crc kubenswrapper[4722]: I0219 19:47:30.037556 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:47:30 crc kubenswrapper[4722]: I0219 19:47:30.046945 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:47:31 crc 
kubenswrapper[4722]: I0219 19:47:31.083530 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" path="/var/lib/kubelet/pods/03387e77-59d8-4377-9a1c-dac948d84b59/volumes" Feb 19 19:47:36 crc kubenswrapper[4722]: I0219 19:47:36.046287 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4b7g9"] Feb 19 19:47:36 crc kubenswrapper[4722]: I0219 19:47:36.062454 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4b7g9"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.034848 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.047345 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.058737 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lqnqr"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.069640 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.085561 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" path="/var/lib/kubelet/pods/5bd3ad13-0324-4c1c-9b74-eb1401f06507/volumes" Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.086543 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93536b6f-8176-4737-a547-9face2995981" path="/var/lib/kubelet/pods/93536b6f-8176-4737-a547-9face2995981/volumes" Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.087232 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.090538 4722 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lqnqr"] Feb 19 19:47:39 crc kubenswrapper[4722]: I0219 19:47:39.084536 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" path="/var/lib/kubelet/pods/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358/volumes" Feb 19 19:47:39 crc kubenswrapper[4722]: I0219 19:47:39.085134 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44afb335-8449-4492-a772-78889877810e" path="/var/lib/kubelet/pods/44afb335-8449-4492-a772-78889877810e/volumes" Feb 19 19:47:40 crc kubenswrapper[4722]: I0219 19:47:40.031980 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2wnjc"] Feb 19 19:47:40 crc kubenswrapper[4722]: I0219 19:47:40.045174 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2wnjc"] Feb 19 19:47:41 crc kubenswrapper[4722]: I0219 19:47:41.083996 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" path="/var/lib/kubelet/pods/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8/volumes" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.072231 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:43 crc kubenswrapper[4722]: E0219 19:47:43.072768 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.591274 4722 scope.go:117] "RemoveContainer" 
containerID="99c98b71002ac8948511844b6989a0da14ae66e034112843908355f3a72c44e7" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.624819 4722 scope.go:117] "RemoveContainer" containerID="8aa3bea30fad3f939a077228a9ed1250c050038afc03ce315c796a876ab91692" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.690962 4722 scope.go:117] "RemoveContainer" containerID="65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.746695 4722 scope.go:117] "RemoveContainer" containerID="d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.809243 4722 scope.go:117] "RemoveContainer" containerID="2e209875892b5272f7bb00341b24fa8e6b2be48cf1bccfa8acb4859e6aeca425" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.839484 4722 scope.go:117] "RemoveContainer" containerID="e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.883992 4722 scope.go:117] "RemoveContainer" containerID="346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.902548 4722 scope.go:117] "RemoveContainer" containerID="c2f010a6f9fb7a90aca42363ebf34cb5a6a44700de8e1351f8ac807b74981bd2" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.930065 4722 scope.go:117] "RemoveContainer" containerID="33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.955226 4722 scope.go:117] "RemoveContainer" containerID="687c2f6cd621666c11c3a553d69b13af20c5311d98a27db188d1d7153219352e" Feb 19 19:47:50 crc kubenswrapper[4722]: I0219 19:47:50.058728 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:47:50 crc kubenswrapper[4722]: I0219 19:47:50.071789 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:47:51 crc kubenswrapper[4722]: I0219 19:47:51.086726 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" path="/var/lib/kubelet/pods/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92/volumes" Feb 19 19:47:55 crc kubenswrapper[4722]: I0219 19:47:55.072350 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:55 crc kubenswrapper[4722]: E0219 19:47:55.073239 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.053484 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.146344 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.157205 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-67cbt"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.166197 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.181236 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.192385 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 
19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.199220 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-67cbt"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.224226 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.230224 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.240546 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.253140 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.262209 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.271196 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.280076 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.028653 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.037712 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.085604 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" path="/var/lib/kubelet/pods/2039a569-0bc4-49a4-9e82-08964729dc7b/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.087506 4722 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" path="/var/lib/kubelet/pods/217ea569-e058-4f21-bbb7-d2f2648375eb/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.088083 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" path="/var/lib/kubelet/pods/25905c52-4074-40d4-826f-ef89353eeaa6/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.088677 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" path="/var/lib/kubelet/pods/619d59b3-6514-4648-9007-6e9ce3427c3a/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.089924 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" path="/var/lib/kubelet/pods/a8d81d51-f4b7-4dec-9548-982de19b4742/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.090477 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" path="/var/lib/kubelet/pods/c78d063e-7cd7-4b41-b148-1a7f9a3f9914/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.091077 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" path="/var/lib/kubelet/pods/d5778eec-eb7e-4137-85bd-761ac78b9fd7/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.092263 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" path="/var/lib/kubelet/pods/fe445148-46c0-4e8c-844a-51a5ce323370/volumes" Feb 19 19:48:02 crc kubenswrapper[4722]: I0219 19:48:02.029242 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:48:02 crc kubenswrapper[4722]: I0219 19:48:02.040633 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:48:03 crc 
kubenswrapper[4722]: I0219 19:48:03.085380 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" path="/var/lib/kubelet/pods/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb/volumes" Feb 19 19:48:07 crc kubenswrapper[4722]: I0219 19:48:07.071490 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:07 crc kubenswrapper[4722]: E0219 19:48:07.072139 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:18 crc kubenswrapper[4722]: I0219 19:48:18.071263 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:18 crc kubenswrapper[4722]: E0219 19:48:18.072055 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:29 crc kubenswrapper[4722]: I0219 19:48:29.071739 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:29 crc kubenswrapper[4722]: E0219 19:48:29.072637 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:32 crc kubenswrapper[4722]: I0219 19:48:32.044973 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7b98l"] Feb 19 19:48:32 crc kubenswrapper[4722]: I0219 19:48:32.055544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7b98l"] Feb 19 19:48:33 crc kubenswrapper[4722]: I0219 19:48:33.084414 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" path="/var/lib/kubelet/pods/eab1ce59-2254-419a-bab0-cf5e87888634/volumes" Feb 19 19:48:33 crc kubenswrapper[4722]: I0219 19:48:33.673496 4722 generic.go:334] "Generic (PLEG): container finished" podID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerID="5a4c6d10bcfa0da53cc4b9e38924013e1f28f0ece8007cef9ebd1b78c76f2e64" exitCode=0 Feb 19 19:48:33 crc kubenswrapper[4722]: I0219 19:48:33.673621 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerDied","Data":"5a4c6d10bcfa0da53cc4b9e38924013e1f28f0ece8007cef9ebd1b78c76f2e64"} Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.240854 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.434947 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.435197 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.435332 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.440891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw" (OuterVolumeSpecName: "kube-api-access-w26gw") pod "23a67d89-596c-44f0-b19d-dc5d1eb3021e" (UID: "23a67d89-596c-44f0-b19d-dc5d1eb3021e"). InnerVolumeSpecName "kube-api-access-w26gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.465757 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23a67d89-596c-44f0-b19d-dc5d1eb3021e" (UID: "23a67d89-596c-44f0-b19d-dc5d1eb3021e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.469098 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory" (OuterVolumeSpecName: "inventory") pod "23a67d89-596c-44f0-b19d-dc5d1eb3021e" (UID: "23a67d89-596c-44f0-b19d-dc5d1eb3021e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.538419 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.538465 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") on node \"crc\" DevicePath \"\"" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.538478 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.707205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" 
event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerDied","Data":"648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda"} Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.708094 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.707296 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.800728 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x"] Feb 19 19:48:35 crc kubenswrapper[4722]: E0219 19:48:35.802560 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.802599 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.802842 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.803806 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.806079 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.807100 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.807127 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.807255 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.814192 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x"] Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.845651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.845880 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: 
I0219 19:48:35.845914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.947696 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.947755 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.947878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.960853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.960876 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.969997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:36 crc kubenswrapper[4722]: I0219 19:48:36.123407 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:36 crc kubenswrapper[4722]: I0219 19:48:36.677925 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x"] Feb 19 19:48:36 crc kubenswrapper[4722]: I0219 19:48:36.721088 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerStarted","Data":"fc864cb4377eebf89734a22ccaa77d70ed0e86aab196f4dc6ade6eea7c72341d"} Feb 19 19:48:37 crc kubenswrapper[4722]: I0219 19:48:37.732699 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerStarted","Data":"35295d18877f348be9df39c76a49295d5c3dcfa4c41c129460eba234068337c9"} Feb 19 19:48:37 crc kubenswrapper[4722]: I0219 19:48:37.758743 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" podStartSLOduration=2.326646943 podStartE2EDuration="2.758725407s" podCreationTimestamp="2026-02-19 19:48:35 +0000 UTC" firstStartedPulling="2026-02-19 19:48:36.683847138 +0000 UTC m=+1816.296197462" lastFinishedPulling="2026-02-19 19:48:37.115925602 +0000 UTC m=+1816.728275926" observedRunningTime="2026-02-19 19:48:37.747227686 +0000 UTC m=+1817.359578010" watchObservedRunningTime="2026-02-19 19:48:37.758725407 +0000 UTC m=+1817.371075721" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.071121 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:44 crc kubenswrapper[4722]: E0219 19:48:44.071918 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.205075 4722 scope.go:117] "RemoveContainer" containerID="bb275fbcbbe35a94955e26075778ab6128134f99af8b8d18b788e7b11aac61c6" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.235952 4722 scope.go:117] "RemoveContainer" containerID="bf1fddeb0ef2831ba2e02a1aa709a530121f690fbf768791dd2408b9c18e9009" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.282051 4722 scope.go:117] "RemoveContainer" containerID="42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.328327 4722 scope.go:117] "RemoveContainer" containerID="a4f4b237835194ac1fcedd350c7532fc74f42e672c498f5c9cea05272f6986a0" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.396143 4722 scope.go:117] "RemoveContainer" containerID="a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.463529 4722 scope.go:117] "RemoveContainer" containerID="6c2e2442beaae76dbd599637b272c7eae6a58710a3bb17eed3e61507df9ea9e0" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.487088 4722 scope.go:117] "RemoveContainer" containerID="8c73c8e1b7d4896f7ab7a5272b3c22c63e7d90ad3033ca9be834b667cd882b7f" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.523723 4722 scope.go:117] "RemoveContainer" containerID="3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.554031 4722 scope.go:117] "RemoveContainer" containerID="6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 
19:48:44.591636 4722 scope.go:117] "RemoveContainer" containerID="24deafb2187b5509b9a503b5cde68eab414e437eef2f36f8141214811c39e398" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.624425 4722 scope.go:117] "RemoveContainer" containerID="6d49fd861306d1a47364956e09d02157a9618a565198ef080d63694bf02fdc31" Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.042089 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.056013 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zrwzj"] Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.067025 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.085048 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" path="/var/lib/kubelet/pods/6175472a-2fd6-4b07-bcb1-4e441a4587aa/volumes" Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.085648 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zrwzj"] Feb 19 19:48:53 crc kubenswrapper[4722]: I0219 19:48:53.092018 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" path="/var/lib/kubelet/pods/41216a8d-32f8-4ec6-ab65-5474453cad03/volumes" Feb 19 19:48:56 crc kubenswrapper[4722]: I0219 19:48:56.071197 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:56 crc kubenswrapper[4722]: E0219 19:48:56.071855 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:01 crc kubenswrapper[4722]: I0219 19:49:01.035249 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lnf5k"] Feb 19 19:49:01 crc kubenswrapper[4722]: I0219 19:49:01.045027 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lnf5k"] Feb 19 19:49:01 crc kubenswrapper[4722]: I0219 19:49:01.082248 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" path="/var/lib/kubelet/pods/9c2453a9-4c81-4256-b52d-edb69c12c7d7/volumes" Feb 19 19:49:02 crc kubenswrapper[4722]: I0219 19:49:02.038222 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nldcm"] Feb 19 19:49:02 crc kubenswrapper[4722]: I0219 19:49:02.048596 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nldcm"] Feb 19 19:49:03 crc kubenswrapper[4722]: I0219 19:49:03.085899 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" path="/var/lib/kubelet/pods/512a4c5e-3ea6-42a8-9f83-8c0e5375891d/volumes" Feb 19 19:49:11 crc kubenswrapper[4722]: I0219 19:49:11.084575 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:11 crc kubenswrapper[4722]: E0219 19:49:11.086424 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:23 crc kubenswrapper[4722]: I0219 19:49:23.073066 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:23 crc kubenswrapper[4722]: E0219 19:49:23.073851 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:37 crc kubenswrapper[4722]: I0219 19:49:37.071367 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:37 crc kubenswrapper[4722]: E0219 19:49:37.072295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:44 crc kubenswrapper[4722]: I0219 19:49:44.893295 4722 scope.go:117] "RemoveContainer" containerID="30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd" Feb 19 19:49:44 crc kubenswrapper[4722]: I0219 19:49:44.928872 4722 scope.go:117] "RemoveContainer" containerID="90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6" Feb 19 19:49:44 crc kubenswrapper[4722]: I0219 19:49:44.988992 4722 scope.go:117] "RemoveContainer" containerID="fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6" Feb 19 19:49:45 crc kubenswrapper[4722]: I0219 19:49:45.040079 4722 
scope.go:117] "RemoveContainer" containerID="c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee" Feb 19 19:49:49 crc kubenswrapper[4722]: I0219 19:49:49.601849 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerID="35295d18877f348be9df39c76a49295d5c3dcfa4c41c129460eba234068337c9" exitCode=0 Feb 19 19:49:49 crc kubenswrapper[4722]: I0219 19:49:49.601938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerDied","Data":"35295d18877f348be9df39c76a49295d5c3dcfa4c41c129460eba234068337c9"} Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.140119 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.201403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.201480 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.201638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\" (UID: 
\"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.208493 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6" (OuterVolumeSpecName: "kube-api-access-zzxv6") pod "7a9a8806-dadf-4cd5-af24-fc35c7e52197" (UID: "7a9a8806-dadf-4cd5-af24-fc35c7e52197"). InnerVolumeSpecName "kube-api-access-zzxv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.243375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory" (OuterVolumeSpecName: "inventory") pod "7a9a8806-dadf-4cd5-af24-fc35c7e52197" (UID: "7a9a8806-dadf-4cd5-af24-fc35c7e52197"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.245618 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7a9a8806-dadf-4cd5-af24-fc35c7e52197" (UID: "7a9a8806-dadf-4cd5-af24-fc35c7e52197"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.303978 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.304016 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.304039 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.639327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerDied","Data":"fc864cb4377eebf89734a22ccaa77d70ed0e86aab196f4dc6ade6eea7c72341d"} Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.639416 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc864cb4377eebf89734a22ccaa77d70ed0e86aab196f4dc6ade6eea7c72341d" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.639414 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.720603 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v"] Feb 19 19:49:51 crc kubenswrapper[4722]: E0219 19:49:51.721170 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.721196 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.721450 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.722399 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.729100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.729537 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.729722 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.730238 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.733298 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v"] Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.813246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.813307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: 
I0219 19:49:51.813367 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.915356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.915725 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.915773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.919542 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.919646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.933947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.041458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.071593 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:52 crc kubenswrapper[4722]: E0219 19:49:52.072036 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.564962 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v"] Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.651003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerStarted","Data":"86af48b9e83d7d95adbe2df6fe8802abce22c90db515ffce0069648800fa485e"} Feb 19 19:49:53 crc kubenswrapper[4722]: I0219 19:49:53.659768 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerStarted","Data":"1c1bf7a6160f737df17b0d7983c1732ea91ec594a77204a5e21603edacd16db0"} Feb 19 19:49:53 crc kubenswrapper[4722]: I0219 19:49:53.682734 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" podStartSLOduration=2.151315433 podStartE2EDuration="2.682711369s" podCreationTimestamp="2026-02-19 19:49:51 +0000 UTC" 
firstStartedPulling="2026-02-19 19:49:52.567229232 +0000 UTC m=+1892.179579556" lastFinishedPulling="2026-02-19 19:49:53.098625168 +0000 UTC m=+1892.710975492" observedRunningTime="2026-02-19 19:49:53.679805572 +0000 UTC m=+1893.292155906" watchObservedRunningTime="2026-02-19 19:49:53.682711369 +0000 UTC m=+1893.295061693" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.694493 4722 generic.go:334] "Generic (PLEG): container finished" podID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerID="1c1bf7a6160f737df17b0d7983c1732ea91ec594a77204a5e21603edacd16db0" exitCode=0 Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.694859 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerDied","Data":"1c1bf7a6160f737df17b0d7983c1732ea91ec594a77204a5e21603edacd16db0"} Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.701521 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.704528 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.709966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.732162 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.732349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.732394 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.834912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835014 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835082 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835742 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.884241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:58 crc kubenswrapper[4722]: I0219 19:49:58.022197 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:58 crc kubenswrapper[4722]: I0219 19:49:58.485054 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:49:58 crc kubenswrapper[4722]: I0219 19:49:58.705911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerStarted","Data":"fc6bc749ec8e5faa86281dd9afaa400f15102b634bc38df4e83908220e9f8b43"} Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.191933 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.263635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.263743 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.263778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.278735 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm" (OuterVolumeSpecName: "kube-api-access-gm9tm") pod "b51489f6-90e0-4a0d-ae54-24eb1e6f5568" (UID: "b51489f6-90e0-4a0d-ae54-24eb1e6f5568"). InnerVolumeSpecName "kube-api-access-gm9tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.288566 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.330959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory" (OuterVolumeSpecName: "inventory") pod "b51489f6-90e0-4a0d-ae54-24eb1e6f5568" (UID: "b51489f6-90e0-4a0d-ae54-24eb1e6f5568"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.333334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b51489f6-90e0-4a0d-ae54-24eb1e6f5568" (UID: "b51489f6-90e0-4a0d-ae54-24eb1e6f5568"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.391184 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.391243 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.481302 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"] Feb 19 19:49:59 crc kubenswrapper[4722]: E0219 19:49:59.482023 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.482051 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.482321 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.484376 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.494167 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"] Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.595224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.595479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.595662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.698099 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.698200 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.698305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.699095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.699352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.716486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.717214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerDied","Data":"86af48b9e83d7d95adbe2df6fe8802abce22c90db515ffce0069648800fa485e"} Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.717303 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86af48b9e83d7d95adbe2df6fe8802abce22c90db515ffce0069648800fa485e" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.717287 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.719244 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2" exitCode=0 Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.719278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2"} Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.807201 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"] Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.808943 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.811632 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.811784 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.811897 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.820133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"] Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.822268 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.833360 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.141234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.141273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: 
\"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.141383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.242903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.243088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.243113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.251139 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.252463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.265931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.348110 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.643630 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"] Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.729225 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerStarted","Data":"5994d3f94bd355ac82deb7f5cfe2381af3489b711b13ce6948c137e7c6fdadf4"} Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.247069 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"] Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.739582 4722 generic.go:334] "Generic (PLEG): container finished" podID="1834c92c-87c2-44ab-acda-1170f3a92303" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e" exitCode=0 Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.739628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"} Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.741667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerStarted","Data":"bdc4d0bdb826bb0e1954e08f60e022a246a6429f3743c4c5ecdb3bc0104f4b0e"} Feb 19 19:50:03 crc kubenswrapper[4722]: I0219 19:50:03.720437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" 
event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerStarted","Data":"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"} Feb 19 19:50:04 crc kubenswrapper[4722]: I0219 19:50:04.730886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerStarted","Data":"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"} Feb 19 19:50:04 crc kubenswrapper[4722]: I0219 19:50:04.732407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerStarted","Data":"8ef5eb3908214f96f5d6505146ed47b1529675834981d6c6dfbef8f12a789667"} Feb 19 19:50:04 crc kubenswrapper[4722]: I0219 19:50:04.882567 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" podStartSLOduration=3.291077917 podStartE2EDuration="5.882536618s" podCreationTimestamp="2026-02-19 19:49:59 +0000 UTC" firstStartedPulling="2026-02-19 19:50:01.237614989 +0000 UTC m=+1900.849965313" lastFinishedPulling="2026-02-19 19:50:03.82907369 +0000 UTC m=+1903.441424014" observedRunningTime="2026-02-19 19:50:04.878964376 +0000 UTC m=+1904.491314700" watchObservedRunningTime="2026-02-19 19:50:04.882536618 +0000 UTC m=+1904.494886942" Feb 19 19:50:06 crc kubenswrapper[4722]: I0219 19:50:06.071796 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:50:06 crc kubenswrapper[4722]: E0219 19:50:06.072392 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:50:08 crc kubenswrapper[4722]: I0219 19:50:08.773732 4722 generic.go:334] "Generic (PLEG): container finished" podID="1834c92c-87c2-44ab-acda-1170f3a92303" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c" exitCode=0 Feb 19 19:50:08 crc kubenswrapper[4722]: I0219 19:50:08.773906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"} Feb 19 19:50:10 crc kubenswrapper[4722]: E0219 19:50:10.748746 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b45dc0_a6d0_4572_be7f_93dc70be0a17.slice/crio-conmon-ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:50:10 crc kubenswrapper[4722]: I0219 19:50:10.793971 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf" exitCode=0 Feb 19 19:50:10 crc kubenswrapper[4722]: I0219 19:50:10.794021 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"} Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.045822 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:50:11 crc kubenswrapper[4722]: 
I0219 19:50:11.061719 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"]
Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.089472 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"]
Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.089514 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-l92p9"]
Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.095824 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"]
Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.105379 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-l92p9"]
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.035439 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"]
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.048020 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"]
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.059969 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"]
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.072231 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"]
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.082487 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"]
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.091884 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"]
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.816734 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerStarted","Data":"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"}
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.819954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerStarted","Data":"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"}
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.846052 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rjnjm" podStartSLOduration=3.1951831139999998 podStartE2EDuration="13.846030867s" podCreationTimestamp="2026-02-19 19:49:59 +0000 UTC" firstStartedPulling="2026-02-19 19:50:01.743078179 +0000 UTC m=+1901.355428503" lastFinishedPulling="2026-02-19 19:50:12.393925932 +0000 UTC m=+1912.006276256" observedRunningTime="2026-02-19 19:50:12.839181853 +0000 UTC m=+1912.451532177" watchObservedRunningTime="2026-02-19 19:50:12.846030867 +0000 UTC m=+1912.458381201"
Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.866654 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8p2td" podStartSLOduration=3.032853581 podStartE2EDuration="15.866631269s" podCreationTimestamp="2026-02-19 19:49:57 +0000 UTC" firstStartedPulling="2026-02-19 19:49:59.72112044 +0000 UTC m=+1899.333470764" lastFinishedPulling="2026-02-19 19:50:12.554898138 +0000 UTC m=+1912.167248452" observedRunningTime="2026-02-19 19:50:12.855919085 +0000 UTC m=+1912.468269429" watchObservedRunningTime="2026-02-19 19:50:12.866631269 +0000 UTC m=+1912.478981603"
Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.083742 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" path="/var/lib/kubelet/pods/3f262eb9-64a7-4b10-85f9-4bc43d512f60/volumes"
Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.084432 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" path="/var/lib/kubelet/pods/5b8ebb77-caea-46ca-8989-d2dd37bf2df5/volumes"
Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.085060 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" path="/var/lib/kubelet/pods/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c/volumes"
Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.085709 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" path="/var/lib/kubelet/pods/823fc346-84d0-4920-bc42-ec213d0c6eef/volumes"
Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.086967 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84699ef3-8d21-4493-8875-81de167ee617" path="/var/lib/kubelet/pods/84699ef3-8d21-4493-8875-81de167ee617/volumes"
Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.087615 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" path="/var/lib/kubelet/pods/c6e27062-a94f-4d8d-8a07-b940d9aa572e/volumes"
Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.022804 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8p2td"
Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.023465 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8p2td"
Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.071256 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8p2td"
Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.957101 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8p2td"
Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.024065 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"]
Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.824586 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rjnjm"
Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.824672 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rjnjm"
Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.873800 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rjnjm"
Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.926220 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rjnjm"
Feb 19 19:50:20 crc kubenswrapper[4722]: I0219 19:50:20.071507 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"
Feb 19 19:50:20 crc kubenswrapper[4722]: E0219 19:50:20.071749 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 19:50:20 crc kubenswrapper[4722]: I0219 19:50:20.707022 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"]
Feb 19 19:50:20 crc kubenswrapper[4722]: I0219 19:50:20.895846 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8p2td" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server" containerID="cri-o://71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c" gracePeriod=2
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.459045 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td"
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.556023 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") "
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.556229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") "
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.556291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") "
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.557202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities" (OuterVolumeSpecName: "utilities") pod "b4b45dc0-a6d0-4572-be7f-93dc70be0a17" (UID: "b4b45dc0-a6d0-4572-be7f-93dc70be0a17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.561611 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv" (OuterVolumeSpecName: "kube-api-access-wszsv") pod "b4b45dc0-a6d0-4572-be7f-93dc70be0a17" (UID: "b4b45dc0-a6d0-4572-be7f-93dc70be0a17"). InnerVolumeSpecName "kube-api-access-wszsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.609284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4b45dc0-a6d0-4572-be7f-93dc70be0a17" (UID: "b4b45dc0-a6d0-4572-be7f-93dc70be0a17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.661642 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.661686 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.661703 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.905969 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c" exitCode=0
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906027 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td"
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"}
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"fc6bc749ec8e5faa86281dd9afaa400f15102b634bc38df4e83908220e9f8b43"}
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906102 4722 scope.go:117] "RemoveContainer" containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906442 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rjnjm" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server" containerID="cri-o://910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03" gracePeriod=2
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.937387 4722 scope.go:117] "RemoveContainer" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.939429 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"]
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.952770 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"]
Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.961801 4722 scope.go:117] "RemoveContainer" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.116414 4722 scope.go:117] "RemoveContainer" containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.124396 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c\": container with ID starting with 71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c not found: ID does not exist" containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.124467 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"} err="failed to get container status \"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c\": rpc error: code = NotFound desc = could not find container \"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c\": container with ID starting with 71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c not found: ID does not exist"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.124501 4722 scope.go:117] "RemoveContainer" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.125094 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf\": container with ID starting with ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf not found: ID does not exist" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.125170 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"} err="failed to get container status \"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf\": rpc error: code = NotFound desc = could not find container \"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf\": container with ID starting with ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf not found: ID does not exist"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.125204 4722 scope.go:117] "RemoveContainer" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.125568 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2\": container with ID starting with 63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2 not found: ID does not exist" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.125598 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2"} err="failed to get container status \"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2\": rpc error: code = NotFound desc = could not find container \"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2\": container with ID starting with 63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2 not found: ID does not exist"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.431112 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.477881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"1834c92c-87c2-44ab-acda-1170f3a92303\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") "
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.477937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"1834c92c-87c2-44ab-acda-1170f3a92303\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") "
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.478143 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"1834c92c-87c2-44ab-acda-1170f3a92303\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") "
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.478942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities" (OuterVolumeSpecName: "utilities") pod "1834c92c-87c2-44ab-acda-1170f3a92303" (UID: "1834c92c-87c2-44ab-acda-1170f3a92303"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.484107 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9" (OuterVolumeSpecName: "kube-api-access-qp6x9") pod "1834c92c-87c2-44ab-acda-1170f3a92303" (UID: "1834c92c-87c2-44ab-acda-1170f3a92303"). InnerVolumeSpecName "kube-api-access-qp6x9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.502896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1834c92c-87c2-44ab-acda-1170f3a92303" (UID: "1834c92c-87c2-44ab-acda-1170f3a92303"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.581350 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.581390 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.581406 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919395 4722 generic.go:334] "Generic (PLEG): container finished" podID="1834c92c-87c2-44ab-acda-1170f3a92303" containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03" exitCode=0
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"}
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"5994d3f94bd355ac82deb7f5cfe2381af3489b711b13ce6948c137e7c6fdadf4"}
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919925 4722 scope.go:117] "RemoveContainer" containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919520 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.947224 4722 scope.go:117] "RemoveContainer" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.958138 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"]
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.969121 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"]
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.970573 4722 scope.go:117] "RemoveContainer" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.995278 4722 scope.go:117] "RemoveContainer" containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.995777 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03\": container with ID starting with 910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03 not found: ID does not exist" containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.995819 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"} err="failed to get container status \"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03\": rpc error: code = NotFound desc = could not find container \"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03\": container with ID starting with 910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03 not found: ID does not exist"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.995846 4722 scope.go:117] "RemoveContainer" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.996190 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c\": container with ID starting with 456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c not found: ID does not exist" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.996231 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"} err="failed to get container status \"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c\": rpc error: code = NotFound desc = could not find container \"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c\": container with ID starting with 456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c not found: ID does not exist"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.996257 4722 scope.go:117] "RemoveContainer" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.996774 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e\": container with ID starting with f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e not found: ID does not exist" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.996801 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"} err="failed to get container status \"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e\": rpc error: code = NotFound desc = could not find container \"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e\": container with ID starting with f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e not found: ID does not exist"
Feb 19 19:50:23 crc kubenswrapper[4722]: I0219 19:50:23.084850 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" path="/var/lib/kubelet/pods/1834c92c-87c2-44ab-acda-1170f3a92303/volumes"
Feb 19 19:50:23 crc kubenswrapper[4722]: I0219 19:50:23.085860 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" path="/var/lib/kubelet/pods/b4b45dc0-a6d0-4572-be7f-93dc70be0a17/volumes"
Feb 19 19:50:34 crc kubenswrapper[4722]: I0219 19:50:34.071808 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"
Feb 19 19:50:34 crc kubenswrapper[4722]: E0219 19:50:34.072816 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 19:50:38 crc kubenswrapper[4722]: I0219 19:50:38.036830 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"]
Feb 19 19:50:38 crc kubenswrapper[4722]: I0219 19:50:38.045381 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"]
Feb 19 19:50:39 crc kubenswrapper[4722]: I0219 19:50:39.337519 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" path="/var/lib/kubelet/pods/e2859f56-714b-43b5-bb67-6ee5493d4f11/volumes"
Feb 19 19:50:40 crc kubenswrapper[4722]: I0219 19:50:40.342578 4722 generic.go:334] "Generic (PLEG): container finished" podID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerID="8ef5eb3908214f96f5d6505146ed47b1529675834981d6c6dfbef8f12a789667" exitCode=0
Feb 19 19:50:40 crc kubenswrapper[4722]: I0219 19:50:40.342654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerDied","Data":"8ef5eb3908214f96f5d6505146ed47b1529675834981d6c6dfbef8f12a789667"}
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.283400 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.428713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") "
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.428863 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") "
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.430291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") "
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.433632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n" (OuterVolumeSpecName: "kube-api-access-kxf9n") pod "fa0d4605-cd87-49b1-b17f-8c0e06590afd" (UID: "fa0d4605-cd87-49b1-b17f-8c0e06590afd"). InnerVolumeSpecName "kube-api-access-kxf9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.454413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory" (OuterVolumeSpecName: "inventory") pod "fa0d4605-cd87-49b1-b17f-8c0e06590afd" (UID: "fa0d4605-cd87-49b1-b17f-8c0e06590afd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.467261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fa0d4605-cd87-49b1-b17f-8c0e06590afd" (UID: "fa0d4605-cd87-49b1-b17f-8c0e06590afd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.533561 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.533592 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.533603 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.765181 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerDied","Data":"bdc4d0bdb826bb0e1954e08f60e022a246a6429f3743c4c5ecdb3bc0104f4b0e"}
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.765230 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc4d0bdb826bb0e1954e08f60e022a246a6429f3743c4c5ecdb3bc0104f4b0e"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.765299 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817260 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"]
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817643 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817659 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817676 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817685 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817696 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817703 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817715 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817720 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817731 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817736 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817749 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817755 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817772 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817777 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817945 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817957 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817983 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.818635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.820915 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.821107 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.821843 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.822661 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.835388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"]
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.942189 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.942764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.943049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.318027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.318396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.318488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.330081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.332861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.353701 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.441245 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" Feb 19 19:50:44 crc kubenswrapper[4722]: I0219 19:50:44.013769 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"] Feb 19 19:50:44 crc kubenswrapper[4722]: I0219 19:50:44.784670 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerStarted","Data":"ed094b50ce9e4b6fb2a00e489a57b8c4478f4a3da5d6d489d485b87719753af5"} Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.223670 4722 scope.go:117] "RemoveContainer" containerID="bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783" Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.252586 4722 scope.go:117] "RemoveContainer" containerID="0c662d869f0260b21b14e815b1c26ef3d995bd4318e89a8c8d85dd5703eaa89e" Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.288966 4722 scope.go:117] "RemoveContainer" containerID="02fdf9891e0a4a5e6a9cd6279f1ac5170d3eaad2e2904682a600a6d410fb2a19" Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.337431 4722 scope.go:117] "RemoveContainer" containerID="59b7ab3b9b5c89b55e17c8616e639ea24cc02e1ca89d3d887ff255092c310b2a" Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.356624 4722 scope.go:117] "RemoveContainer" containerID="34ce6fe937d88e617e83f04f4163bf9713e6cac4114d5734077d30be33461dbc" Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.376874 4722 scope.go:117] "RemoveContainer" containerID="0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e" Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.427864 4722 scope.go:117] "RemoveContainer" containerID="a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3" Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.795747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerStarted","Data":"c91f7b8f75e6df2eee89ce0b127406ba6281160fc85e04bd9c2b6a2e9c76dae0"} Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.827940 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" podStartSLOduration=3.166611313 podStartE2EDuration="3.827920098s" podCreationTimestamp="2026-02-19 19:50:42 +0000 UTC" firstStartedPulling="2026-02-19 19:50:44.020753796 +0000 UTC m=+1943.633104120" lastFinishedPulling="2026-02-19 19:50:44.682062571 +0000 UTC m=+1944.294412905" observedRunningTime="2026-02-19 19:50:45.814266581 +0000 UTC m=+1945.426616905" watchObservedRunningTime="2026-02-19 19:50:45.827920098 +0000 UTC m=+1945.440270422" Feb 19 19:50:49 crc kubenswrapper[4722]: I0219 19:50:49.071571 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:50:49 crc kubenswrapper[4722]: E0219 19:50:49.072291 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:51:01 crc kubenswrapper[4722]: I0219 19:51:01.048421 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"] Feb 19 19:51:01 crc kubenswrapper[4722]: I0219 19:51:01.057581 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"] Feb 19 19:51:01 crc kubenswrapper[4722]: I0219 19:51:01.083451 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" path="/var/lib/kubelet/pods/d1a230c6-6844-4483-a8b4-0ae8073dff8d/volumes" Feb 19 19:51:03 crc kubenswrapper[4722]: I0219 19:51:03.072057 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:51:03 crc kubenswrapper[4722]: E0219 19:51:03.072753 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:51:07 crc kubenswrapper[4722]: I0219 19:51:07.032786 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"] Feb 19 19:51:07 crc kubenswrapper[4722]: I0219 19:51:07.048927 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"] Feb 19 19:51:07 crc kubenswrapper[4722]: I0219 19:51:07.087795 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" path="/var/lib/kubelet/pods/106da00f-55de-4b4f-8a57-b8f0b1994c2f/volumes" Feb 19 19:51:15 crc kubenswrapper[4722]: I0219 19:51:15.071119 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:51:16 crc kubenswrapper[4722]: I0219 19:51:16.117830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef"} Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.238442 4722 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.243635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.254614 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.332751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.332982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.333410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435187 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"redhat-operators-p59r9\" (UID: 
\"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.458679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " 
pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.566826 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:22 crc kubenswrapper[4722]: I0219 19:51:22.091298 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:22 crc kubenswrapper[4722]: I0219 19:51:22.193023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerStarted","Data":"99ef2afc5d68e5988babf1bade224e9fb79a3934fe1cd919c9eb1f3329445aa3"} Feb 19 19:51:23 crc kubenswrapper[4722]: I0219 19:51:23.205681 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4" exitCode=0 Feb 19 19:51:23 crc kubenswrapper[4722]: I0219 19:51:23.205764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4"} Feb 19 19:51:24 crc kubenswrapper[4722]: I0219 19:51:24.218718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerStarted","Data":"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"} Feb 19 19:51:28 crc kubenswrapper[4722]: I0219 19:51:28.259362 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerID="c91f7b8f75e6df2eee89ce0b127406ba6281160fc85e04bd9c2b6a2e9c76dae0" exitCode=0 Feb 19 19:51:28 crc kubenswrapper[4722]: I0219 19:51:28.259469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerDied","Data":"c91f7b8f75e6df2eee89ce0b127406ba6281160fc85e04bd9c2b6a2e9c76dae0"} Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.775042 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.805416 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.805483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.805821 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.813440 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn" (OuterVolumeSpecName: "kube-api-access-jhmqn") pod "7cf0842e-58ac-4cd1-b26f-9fc131177aa9" (UID: "7cf0842e-58ac-4cd1-b26f-9fc131177aa9"). InnerVolumeSpecName "kube-api-access-jhmqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.835490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7cf0842e-58ac-4cd1-b26f-9fc131177aa9" (UID: "7cf0842e-58ac-4cd1-b26f-9fc131177aa9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.849399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory" (OuterVolumeSpecName: "inventory") pod "7cf0842e-58ac-4cd1-b26f-9fc131177aa9" (UID: "7cf0842e-58ac-4cd1-b26f-9fc131177aa9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.908241 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.908282 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.908295 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.278010 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.278003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerDied","Data":"ed094b50ce9e4b6fb2a00e489a57b8c4478f4a3da5d6d489d485b87719753af5"} Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.278069 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed094b50ce9e4b6fb2a00e489a57b8c4478f4a3da5d6d489d485b87719753af5" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.280074 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d" exitCode=0 Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.280108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"} Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.392488 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rr66z"] Feb 19 19:51:30 crc kubenswrapper[4722]: E0219 19:51:30.392898 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.392922 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.393571 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.394390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.398223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.398271 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.398426 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.400086 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.406817 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rr66z"] Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.425329 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.425433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: 
\"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.425552 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.527487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.527536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.527608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.531500 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.532125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.549809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.724783 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:31 crc kubenswrapper[4722]: I0219 19:51:31.844839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rr66z"] Feb 19 19:51:32 crc kubenswrapper[4722]: I0219 19:51:32.743793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerStarted","Data":"2df917e43a416f61f0915f579f37cf08041e0c25666ea46224e704144a375c11"} Feb 19 19:51:33 crc kubenswrapper[4722]: I0219 19:51:33.757034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerStarted","Data":"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2"} Feb 19 19:51:33 crc kubenswrapper[4722]: I0219 19:51:33.782817 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p59r9" podStartSLOduration=2.764545821 podStartE2EDuration="12.782796224s" podCreationTimestamp="2026-02-19 19:51:21 +0000 UTC" firstStartedPulling="2026-02-19 19:51:23.207563153 +0000 UTC m=+1982.819913477" lastFinishedPulling="2026-02-19 19:51:33.225813566 +0000 UTC m=+1992.838163880" observedRunningTime="2026-02-19 19:51:33.775629771 +0000 UTC m=+1993.387980115" watchObservedRunningTime="2026-02-19 19:51:33.782796224 +0000 UTC m=+1993.395146548" Feb 19 19:51:34 crc kubenswrapper[4722]: I0219 19:51:34.770743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerStarted","Data":"abe5af0f2de7ed90cae73238503c9bc96c0c732faee4f5f8a93b3f0a5fd43d62"} Feb 19 19:51:34 crc kubenswrapper[4722]: I0219 19:51:34.805608 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" podStartSLOduration=3.10742121 podStartE2EDuration="4.805580834s" podCreationTimestamp="2026-02-19 19:51:30 +0000 UTC" firstStartedPulling="2026-02-19 19:51:31.840989594 +0000 UTC m=+1991.453339918" lastFinishedPulling="2026-02-19 19:51:33.539149218 +0000 UTC m=+1993.151499542" observedRunningTime="2026-02-19 19:51:34.792950939 +0000 UTC m=+1994.405301263" watchObservedRunningTime="2026-02-19 19:51:34.805580834 +0000 UTC m=+1994.417931178" Feb 19 19:51:39 crc kubenswrapper[4722]: I0219 19:51:39.819800 4722 generic.go:334] "Generic (PLEG): container finished" podID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerID="abe5af0f2de7ed90cae73238503c9bc96c0c732faee4f5f8a93b3f0a5fd43d62" exitCode=0 Feb 19 19:51:39 crc kubenswrapper[4722]: I0219 19:51:39.819901 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerDied","Data":"abe5af0f2de7ed90cae73238503c9bc96c0c732faee4f5f8a93b3f0a5fd43d62"} Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.347987 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.526888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"812efe23-7ca7-49b9-bd76-194a82c603b3\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.527007 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"812efe23-7ca7-49b9-bd76-194a82c603b3\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.527198 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"812efe23-7ca7-49b9-bd76-194a82c603b3\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.533318 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57" (OuterVolumeSpecName: "kube-api-access-6mr57") pod "812efe23-7ca7-49b9-bd76-194a82c603b3" (UID: "812efe23-7ca7-49b9-bd76-194a82c603b3"). InnerVolumeSpecName "kube-api-access-6mr57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.557331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "812efe23-7ca7-49b9-bd76-194a82c603b3" (UID: "812efe23-7ca7-49b9-bd76-194a82c603b3"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.558716 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "812efe23-7ca7-49b9-bd76-194a82c603b3" (UID: "812efe23-7ca7-49b9-bd76-194a82c603b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.567070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.567247 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.618852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.629528 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.629567 4722 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.629581 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:41 crc kubenswrapper[4722]: 
I0219 19:51:41.843019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerDied","Data":"2df917e43a416f61f0915f579f37cf08041e0c25666ea46224e704144a375c11"} Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.843064 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2df917e43a416f61f0915f579f37cf08041e0c25666ea46224e704144a375c11" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.843033 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.902311 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.916348 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"] Feb 19 19:51:41 crc kubenswrapper[4722]: E0219 19:51:41.916842 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerName="ssh-known-hosts-edpm-deployment" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.916858 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerName="ssh-known-hosts-edpm-deployment" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.917054 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerName="ssh-known-hosts-edpm-deployment" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.917876 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.922983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.923305 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.923454 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.923595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.931290 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"] Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.982255 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.038426 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.038572 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: 
\"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.038681 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.140992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.141393 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.141555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.145228 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.150729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.165240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.254353 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.776767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"] Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.853189 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerStarted","Data":"48509c603cbc50d92636f3b8fd5508c2be236b4b60cc1033e3fe5f5f28886ca6"} Feb 19 19:51:43 crc kubenswrapper[4722]: I0219 19:51:43.861839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerStarted","Data":"4a87c6050a72a549c80668d7d9b519552efbd08d93bc7244a2c305d766c13317"} Feb 19 19:51:43 crc kubenswrapper[4722]: I0219 19:51:43.862025 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p59r9" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server" containerID="cri-o://5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" gracePeriod=2 Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.452840 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.596787 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"b2244865-e076-45b5-9bd9-d639d96d6ffe\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.596952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"b2244865-e076-45b5-9bd9-d639d96d6ffe\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.597043 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"b2244865-e076-45b5-9bd9-d639d96d6ffe\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.597617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities" (OuterVolumeSpecName: "utilities") pod "b2244865-e076-45b5-9bd9-d639d96d6ffe" (UID: "b2244865-e076-45b5-9bd9-d639d96d6ffe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.597985 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.602708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9" (OuterVolumeSpecName: "kube-api-access-r2tt9") pod "b2244865-e076-45b5-9bd9-d639d96d6ffe" (UID: "b2244865-e076-45b5-9bd9-d639d96d6ffe"). InnerVolumeSpecName "kube-api-access-r2tt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.700334 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.720142 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2244865-e076-45b5-9bd9-d639d96d6ffe" (UID: "b2244865-e076-45b5-9bd9-d639d96d6ffe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.802432 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875651 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" exitCode=0 Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875849 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2"} Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"99ef2afc5d68e5988babf1bade224e9fb79a3934fe1cd919c9eb1f3329445aa3"} Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875905 4722 scope.go:117] "RemoveContainer" containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.876054 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.900391 4722 scope.go:117] "RemoveContainer" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.914145 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" podStartSLOduration=3.102753357 podStartE2EDuration="3.914111557s" podCreationTimestamp="2026-02-19 19:51:41 +0000 UTC" firstStartedPulling="2026-02-19 19:51:42.782746889 +0000 UTC m=+2002.395097213" lastFinishedPulling="2026-02-19 19:51:43.594105089 +0000 UTC m=+2003.206455413" observedRunningTime="2026-02-19 19:51:44.910219035 +0000 UTC m=+2004.522569399" watchObservedRunningTime="2026-02-19 19:51:44.914111557 +0000 UTC m=+2004.526461881" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.936459 4722 scope.go:117] "RemoveContainer" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.941548 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.951561 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.978892 4722 scope.go:117] "RemoveContainer" containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" Feb 19 19:51:44 crc kubenswrapper[4722]: E0219 19:51:44.979325 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2\": container with ID starting with 5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2 not found: ID does not exist" 
containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979364 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2"} err="failed to get container status \"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2\": rpc error: code = NotFound desc = could not find container \"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2\": container with ID starting with 5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2 not found: ID does not exist" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979392 4722 scope.go:117] "RemoveContainer" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d" Feb 19 19:51:44 crc kubenswrapper[4722]: E0219 19:51:44.979831 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d\": container with ID starting with b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d not found: ID does not exist" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979880 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"} err="failed to get container status \"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d\": rpc error: code = NotFound desc = could not find container \"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d\": container with ID starting with b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d not found: ID does not exist" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979912 4722 scope.go:117] 
"RemoveContainer" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4" Feb 19 19:51:44 crc kubenswrapper[4722]: E0219 19:51:44.980202 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4\": container with ID starting with b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4 not found: ID does not exist" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.980234 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4"} err="failed to get container status \"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4\": rpc error: code = NotFound desc = could not find container \"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4\": container with ID starting with b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4 not found: ID does not exist" Feb 19 19:51:45 crc kubenswrapper[4722]: I0219 19:51:45.083833 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" path="/var/lib/kubelet/pods/b2244865-e076-45b5-9bd9-d639d96d6ffe/volumes" Feb 19 19:51:45 crc kubenswrapper[4722]: I0219 19:51:45.666683 4722 scope.go:117] "RemoveContainer" containerID="60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1" Feb 19 19:51:45 crc kubenswrapper[4722]: I0219 19:51:45.708976 4722 scope.go:117] "RemoveContainer" containerID="b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340" Feb 19 19:51:48 crc kubenswrapper[4722]: I0219 19:51:48.057061 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"] Feb 19 19:51:48 crc kubenswrapper[4722]: I0219 19:51:48.070311 4722 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"] Feb 19 19:51:49 crc kubenswrapper[4722]: I0219 19:51:49.083648 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" path="/var/lib/kubelet/pods/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8/volumes" Feb 19 19:51:50 crc kubenswrapper[4722]: I0219 19:51:50.947388 4722 generic.go:334] "Generic (PLEG): container finished" podID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerID="4a87c6050a72a549c80668d7d9b519552efbd08d93bc7244a2c305d766c13317" exitCode=0 Feb 19 19:51:50 crc kubenswrapper[4722]: I0219 19:51:50.947523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerDied","Data":"4a87c6050a72a549c80668d7d9b519552efbd08d93bc7244a2c305d766c13317"} Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.429271 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.566701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.566831 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.566854 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.572942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z" (OuterVolumeSpecName: "kube-api-access-fwq8z") pod "44ab5cbe-e4cd-4036-8768-104fcf0d8963" (UID: "44ab5cbe-e4cd-4036-8768-104fcf0d8963"). InnerVolumeSpecName "kube-api-access-fwq8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.595658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory" (OuterVolumeSpecName: "inventory") pod "44ab5cbe-e4cd-4036-8768-104fcf0d8963" (UID: "44ab5cbe-e4cd-4036-8768-104fcf0d8963"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.597529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "44ab5cbe-e4cd-4036-8768-104fcf0d8963" (UID: "44ab5cbe-e4cd-4036-8768-104fcf0d8963"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.669648 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.669688 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.669704 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.968781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerDied","Data":"48509c603cbc50d92636f3b8fd5508c2be236b4b60cc1033e3fe5f5f28886ca6"} Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.968834 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48509c603cbc50d92636f3b8fd5508c2be236b4b60cc1033e3fe5f5f28886ca6" Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 
19:51:52.969134 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.044614 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"] Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.045820 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-content" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.045911 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-content" Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.045967 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046018 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.046100 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-utilities" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046217 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-utilities" Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.046292 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046349 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server" Feb 19 
19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046650 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.047505 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.053386 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"] Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.089801 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.090031 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.090186 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.090345 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.188783 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc 
kubenswrapper[4722]: I0219 19:51:53.188860 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.188922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.291642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.291728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.291797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m79fx\" (UniqueName: 
\"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.295917 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.298998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.309365 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.423560 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.986576 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"] Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.989673 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:51:54 crc kubenswrapper[4722]: I0219 19:51:54.993261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerStarted","Data":"143b81501dc786d37c3fad45b6cc39d9b601b99871fd6c5f89b351f716bba996"} Feb 19 19:51:54 crc kubenswrapper[4722]: I0219 19:51:54.993792 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerStarted","Data":"3b4cf662cf6ba5a00d111cabca8e0756fd25592ca41c1262d13ee098f3a88482"} Feb 19 19:52:04 crc kubenswrapper[4722]: I0219 19:52:04.090201 4722 generic.go:334] "Generic (PLEG): container finished" podID="baff33d3-a587-4283-a861-38d88a47539e" containerID="143b81501dc786d37c3fad45b6cc39d9b601b99871fd6c5f89b351f716bba996" exitCode=0 Feb 19 19:52:04 crc kubenswrapper[4722]: I0219 19:52:04.090458 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerDied","Data":"143b81501dc786d37c3fad45b6cc39d9b601b99871fd6c5f89b351f716bba996"} Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.569069 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.660834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"baff33d3-a587-4283-a861-38d88a47539e\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.660933 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"baff33d3-a587-4283-a861-38d88a47539e\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.661201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"baff33d3-a587-4283-a861-38d88a47539e\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.669102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx" (OuterVolumeSpecName: "kube-api-access-m79fx") pod "baff33d3-a587-4283-a861-38d88a47539e" (UID: "baff33d3-a587-4283-a861-38d88a47539e"). InnerVolumeSpecName "kube-api-access-m79fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.692607 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory" (OuterVolumeSpecName: "inventory") pod "baff33d3-a587-4283-a861-38d88a47539e" (UID: "baff33d3-a587-4283-a861-38d88a47539e"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.698326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "baff33d3-a587-4283-a861-38d88a47539e" (UID: "baff33d3-a587-4283-a861-38d88a47539e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.763079 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.763118 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.763157 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.114423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerDied","Data":"3b4cf662cf6ba5a00d111cabca8e0756fd25592ca41c1262d13ee098f3a88482"} Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.114460 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b4cf662cf6ba5a00d111cabca8e0756fd25592ca41c1262d13ee098f3a88482" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 
19:52:06.114485 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.241935 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"] Feb 19 19:52:06 crc kubenswrapper[4722]: E0219 19:52:06.242441 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baff33d3-a587-4283-a861-38d88a47539e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.242459 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="baff33d3-a587-4283-a861-38d88a47539e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.242697 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="baff33d3-a587-4283-a861-38d88a47539e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.243507 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.255138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"] Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301241 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301512 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301952 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.302189 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.302439 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405342 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406131 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406197 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406222 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406454 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508799 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509040 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509138 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.510662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.510731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.514848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: 
I0219 19:52:06.514887 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.515041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.515600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.516212 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.516548 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.516829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.517241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.517909 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.527040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.527957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.530330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.531989 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.535433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: 
\"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.619267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:07 crc kubenswrapper[4722]: I0219 19:52:07.149574 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"] Feb 19 19:52:08 crc kubenswrapper[4722]: I0219 19:52:08.138369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerStarted","Data":"321fbe59a82c15c45c65820fb8c050c7d091a8fa12ab01541eab805404a4639d"} Feb 19 19:52:08 crc kubenswrapper[4722]: I0219 19:52:08.138951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerStarted","Data":"dc0934b6dce273124dc8075661c982c60a4cfbee12a1f48ca3c5fd27bd9ba327"} Feb 19 19:52:08 crc kubenswrapper[4722]: I0219 19:52:08.165486 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" podStartSLOduration=1.7751663039999999 podStartE2EDuration="2.165465618s" podCreationTimestamp="2026-02-19 19:52:06 +0000 UTC" firstStartedPulling="2026-02-19 19:52:07.163298312 +0000 UTC m=+2026.775648636" lastFinishedPulling="2026-02-19 19:52:07.553597626 +0000 UTC m=+2027.165947950" observedRunningTime="2026-02-19 19:52:08.157446308 +0000 UTC m=+2027.769796642" watchObservedRunningTime="2026-02-19 19:52:08.165465618 +0000 UTC m=+2027.777815942" Feb 19 19:52:33 crc kubenswrapper[4722]: I0219 19:52:33.473139 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:52:33 crc kubenswrapper[4722]: I0219 19:52:33.481084 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:52:35 crc kubenswrapper[4722]: I0219 19:52:35.089269 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" path="/var/lib/kubelet/pods/3eb4da3f-b07b-4b6f-a524-8b2af229ed87/volumes" Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.036017 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.046465 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.091581 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" path="/var/lib/kubelet/pods/04e19f64-06d2-4c0e-b33c-000fea5deb27/volumes" Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.499346 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerID="321fbe59a82c15c45c65820fb8c050c7d091a8fa12ab01541eab805404a4639d" exitCode=0 Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.499387 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerDied","Data":"321fbe59a82c15c45c65820fb8c050c7d091a8fa12ab01541eab805404a4639d"} Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.010709 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025674 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025759 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: 
\"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025812 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025837 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 
19:52:43.025987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.026012 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.026049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.026071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.033246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.033318 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.033417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2" (OuterVolumeSpecName: "kube-api-access-8x8x2") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "kube-api-access-8x8x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.034341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.038408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039621 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039627 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.040285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.041050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.046265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.074706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.094188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory" (OuterVolumeSpecName: "inventory") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129007 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129039 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129049 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129059 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129069 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: 
I0219 19:52:43.129079 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129088 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129098 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129111 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129123 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129135 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129167 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129181 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129194 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.546395 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerDied","Data":"dc0934b6dce273124dc8075661c982c60a4cfbee12a1f48ca3c5fd27bd9ba327"} Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.546437 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0934b6dce273124dc8075661c982c60a4cfbee12a1f48ca3c5fd27bd9ba327" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.546559 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.619899 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89"] Feb 19 19:52:43 crc kubenswrapper[4722]: E0219 19:52:43.620496 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.620525 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.620853 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.621837 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.625446 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.625748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.625913 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.626143 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.627424 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.636203 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89"] Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.644061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.747666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.750300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.752580 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.757755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.767099 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.946659 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:44 crc kubenswrapper[4722]: I0219 19:52:44.491370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89"] Feb 19 19:52:44 crc kubenswrapper[4722]: I0219 19:52:44.571980 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerStarted","Data":"751f892a57debc49c4d931416540d4a8f9ab6159fbeea6888e820b630b8b4812"} Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.582073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerStarted","Data":"456f8778f9036ba7b994c1e10b0831dd76f418dfe38b1485979e5bcdaf170ca1"} Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.603014 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" podStartSLOduration=2.151882266 podStartE2EDuration="2.602992059s" podCreationTimestamp="2026-02-19 19:52:43 +0000 UTC" firstStartedPulling="2026-02-19 19:52:44.499611043 +0000 UTC m=+2064.111961367" lastFinishedPulling="2026-02-19 19:52:44.950720836 +0000 UTC m=+2064.563071160" observedRunningTime="2026-02-19 19:52:45.59853863 +0000 UTC m=+2065.210888974" watchObservedRunningTime="2026-02-19 19:52:45.602992059 +0000 UTC m=+2065.215342383" Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.814379 4722 scope.go:117] "RemoveContainer" containerID="b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08" Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.848682 4722 scope.go:117] "RemoveContainer" containerID="6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb" Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.894072 4722 
scope.go:117] "RemoveContainer" containerID="1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63" Feb 19 19:53:41 crc kubenswrapper[4722]: I0219 19:53:41.798785 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:53:41 crc kubenswrapper[4722]: I0219 19:53:41.800298 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:53:42 crc kubenswrapper[4722]: I0219 19:53:42.092323 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerID="456f8778f9036ba7b994c1e10b0831dd76f418dfe38b1485979e5bcdaf170ca1" exitCode=0 Feb 19 19:53:42 crc kubenswrapper[4722]: I0219 19:53:42.092369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerDied","Data":"456f8778f9036ba7b994c1e10b0831dd76f418dfe38b1485979e5bcdaf170ca1"} Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.622408 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.769579 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.778396 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49" (OuterVolumeSpecName: "kube-api-access-h9h49") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "kube-api-access-h9h49". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.804780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.843346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory" (OuterVolumeSpecName: "inventory") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864662 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864694 4722 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864705 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864714 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.899321 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.966013 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.110191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerDied","Data":"751f892a57debc49c4d931416540d4a8f9ab6159fbeea6888e820b630b8b4812"} Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.110231 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="751f892a57debc49c4d931416540d4a8f9ab6159fbeea6888e820b630b8b4812" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.110281 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.215210 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf"] Feb 19 19:53:44 crc kubenswrapper[4722]: E0219 19:53:44.216076 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.216104 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.216392 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.217396 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.225676 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.225924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.225757 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.226236 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.226416 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.226426 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.230826 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf"] Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375502 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375559 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375625 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.477612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478599 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478947 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.482261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.482664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.483069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.483265 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.483391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.496406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.538231 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:45 crc kubenswrapper[4722]: I0219 19:53:45.119566 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf"] Feb 19 19:53:46 crc kubenswrapper[4722]: I0219 19:53:46.135525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerStarted","Data":"42128f3069e7caf833eb9af4ff7adba03d900487ba12c81ba237fa92fcdff17c"} Feb 19 19:53:46 crc kubenswrapper[4722]: I0219 19:53:46.136095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerStarted","Data":"1e8b756a37868b59c74f59e4ae72a0d353d8102de72325a1db36c98c4ee3665a"} Feb 19 19:54:11 crc kubenswrapper[4722]: I0219 19:54:11.798633 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:54:11 crc kubenswrapper[4722]: I0219 19:54:11.799396 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:54:29 crc kubenswrapper[4722]: I0219 19:54:29.497108 4722 
generic.go:334] "Generic (PLEG): container finished" podID="ee896205-7724-47fe-9f87-f2efb9afa870" containerID="42128f3069e7caf833eb9af4ff7adba03d900487ba12c81ba237fa92fcdff17c" exitCode=0 Feb 19 19:54:29 crc kubenswrapper[4722]: I0219 19:54:29.497275 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerDied","Data":"42128f3069e7caf833eb9af4ff7adba03d900487ba12c81ba237fa92fcdff17c"} Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.110945 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.236615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237308 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237531 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.245292 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf" (OuterVolumeSpecName: "kube-api-access-kxcnf") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "kube-api-access-kxcnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.256345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.271176 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.273369 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.273398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.273788 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory" (OuterVolumeSpecName: "inventory") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340798 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340831 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340844 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340855 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340867 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340876 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.520001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerDied","Data":"1e8b756a37868b59c74f59e4ae72a0d353d8102de72325a1db36c98c4ee3665a"} Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.520040 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8b756a37868b59c74f59e4ae72a0d353d8102de72325a1db36c98c4ee3665a" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.520134 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.622070 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2"] Feb 19 19:54:31 crc kubenswrapper[4722]: E0219 19:54:31.622763 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee896205-7724-47fe-9f87-f2efb9afa870" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.622841 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee896205-7724-47fe-9f87-f2efb9afa870" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.623249 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee896205-7724-47fe-9f87-f2efb9afa870" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.624662 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630588 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630729 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630926 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.632388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2"] Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748369 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: 
\"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850707 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850965 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.854794 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" 
(UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.855323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.856482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.857600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.868757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.946254 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:32 crc kubenswrapper[4722]: I0219 19:54:32.459123 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2"] Feb 19 19:54:32 crc kubenswrapper[4722]: I0219 19:54:32.529774 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerStarted","Data":"17f45e35533bb8a07ee9122a5653857a6db08cf7018f843e7d190f9e046c6b5c"} Feb 19 19:54:33 crc kubenswrapper[4722]: I0219 19:54:33.539852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerStarted","Data":"6e06dc95162c0d51603edd10c6b9f7656cb9b02520ae430117bbe54d6a6625f4"} Feb 19 19:54:33 crc kubenswrapper[4722]: I0219 19:54:33.561937 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" podStartSLOduration=2.167612419 podStartE2EDuration="2.561920541s" podCreationTimestamp="2026-02-19 19:54:31 +0000 UTC" firstStartedPulling="2026-02-19 19:54:32.465602093 +0000 UTC m=+2172.077952417" lastFinishedPulling="2026-02-19 19:54:32.859910215 +0000 UTC m=+2172.472260539" observedRunningTime="2026-02-19 19:54:33.553496388 +0000 UTC m=+2173.165846732" watchObservedRunningTime="2026-02-19 19:54:33.561920541 +0000 UTC m=+2173.174270865" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.798268 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 
19:54:41.798847 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.798891 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.799720 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.799774 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef" gracePeriod=600 Feb 19 19:54:42 crc kubenswrapper[4722]: I0219 19:54:42.626389 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef" exitCode=0 Feb 19 19:54:42 crc kubenswrapper[4722]: I0219 19:54:42.626466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef"} Feb 19 19:54:42 crc 
kubenswrapper[4722]: I0219 19:54:42.626694 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209"} Feb 19 19:54:42 crc kubenswrapper[4722]: I0219 19:54:42.626715 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.267742 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.270901 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.287806 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.403591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.403727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.403757 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.505630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.505773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.505800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.506229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.506352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.534094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.594526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:25 crc kubenswrapper[4722]: I0219 19:55:25.150488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:26 crc kubenswrapper[4722]: I0219 19:55:26.016880 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b93b205-cb18-4cb4-810c-17775c15279e" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" exitCode=0 Feb 19 19:55:26 crc kubenswrapper[4722]: I0219 19:55:26.016935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6"} Feb 19 19:55:26 crc kubenswrapper[4722]: I0219 19:55:26.017241 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerStarted","Data":"7fd6a711f28aeb9304a059eacf02cde35648f02d7504e83c7695fce054b0c1b6"} Feb 19 19:55:28 crc kubenswrapper[4722]: I0219 19:55:28.036408 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerStarted","Data":"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c"} Feb 19 19:55:30 crc kubenswrapper[4722]: I0219 19:55:30.056861 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b93b205-cb18-4cb4-810c-17775c15279e" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" exitCode=0 Feb 19 19:55:30 crc kubenswrapper[4722]: I0219 19:55:30.056943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c"} Feb 19 19:55:31 crc kubenswrapper[4722]: I0219 19:55:31.083072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerStarted","Data":"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf"} Feb 19 19:55:31 crc kubenswrapper[4722]: I0219 19:55:31.102781 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pmrr9" podStartSLOduration=2.69652731 podStartE2EDuration="7.102764957s" podCreationTimestamp="2026-02-19 19:55:24 +0000 UTC" firstStartedPulling="2026-02-19 19:55:26.020446561 +0000 UTC m=+2225.632796895" lastFinishedPulling="2026-02-19 19:55:30.426684218 +0000 UTC m=+2230.039034542" observedRunningTime="2026-02-19 19:55:31.099730732 +0000 UTC m=+2230.712081056" watchObservedRunningTime="2026-02-19 19:55:31.102764957 +0000 UTC m=+2230.715115281" Feb 19 19:55:34 crc kubenswrapper[4722]: I0219 19:55:34.595344 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:34 crc 
kubenswrapper[4722]: I0219 19:55:34.597133 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:34 crc kubenswrapper[4722]: I0219 19:55:34.648742 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:35 crc kubenswrapper[4722]: I0219 19:55:35.161739 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:36 crc kubenswrapper[4722]: I0219 19:55:36.654715 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.142346 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pmrr9" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" containerID="cri-o://a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" gracePeriod=2 Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.687840 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.733005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"7b93b205-cb18-4cb4-810c-17775c15279e\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.786205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b93b205-cb18-4cb4-810c-17775c15279e" (UID: "7b93b205-cb18-4cb4-810c-17775c15279e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.835668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"7b93b205-cb18-4cb4-810c-17775c15279e\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.836022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"7b93b205-cb18-4cb4-810c-17775c15279e\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.836512 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.836623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities" (OuterVolumeSpecName: "utilities") pod "7b93b205-cb18-4cb4-810c-17775c15279e" (UID: "7b93b205-cb18-4cb4-810c-17775c15279e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.840913 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm" (OuterVolumeSpecName: "kube-api-access-l6kxm") pod "7b93b205-cb18-4cb4-810c-17775c15279e" (UID: "7b93b205-cb18-4cb4-810c-17775c15279e"). InnerVolumeSpecName "kube-api-access-l6kxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.939409 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.939689 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154417 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b93b205-cb18-4cb4-810c-17775c15279e" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" exitCode=0 Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf"} Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154864 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"7fd6a711f28aeb9304a059eacf02cde35648f02d7504e83c7695fce054b0c1b6"} Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154900 4722 scope.go:117] "RemoveContainer" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.155073 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.183931 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.184482 4722 scope.go:117] "RemoveContainer" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.197284 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.203285 4722 scope.go:117] "RemoveContainer" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.263791 4722 scope.go:117] "RemoveContainer" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" Feb 19 19:55:39 crc kubenswrapper[4722]: E0219 19:55:39.264355 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf\": container with ID starting with a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf not found: ID does not exist" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264395 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf"} err="failed to get container status \"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf\": rpc error: code = NotFound desc = could not find container \"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf\": container with ID starting with a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf not 
found: ID does not exist" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264422 4722 scope.go:117] "RemoveContainer" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" Feb 19 19:55:39 crc kubenswrapper[4722]: E0219 19:55:39.264862 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c\": container with ID starting with 36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c not found: ID does not exist" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264884 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c"} err="failed to get container status \"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c\": rpc error: code = NotFound desc = could not find container \"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c\": container with ID starting with 36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c not found: ID does not exist" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264896 4722 scope.go:117] "RemoveContainer" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" Feb 19 19:55:39 crc kubenswrapper[4722]: E0219 19:55:39.265251 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6\": container with ID starting with 26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6 not found: ID does not exist" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.265301 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6"} err="failed to get container status \"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6\": rpc error: code = NotFound desc = could not find container \"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6\": container with ID starting with 26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6 not found: ID does not exist" Feb 19 19:55:41 crc kubenswrapper[4722]: I0219 19:55:41.084587 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" path="/var/lib/kubelet/pods/7b93b205-cb18-4cb4-810c-17775c15279e/volumes" Feb 19 19:57:11 crc kubenswrapper[4722]: I0219 19:57:11.798467 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:57:11 crc kubenswrapper[4722]: I0219 19:57:11.799100 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:57:41 crc kubenswrapper[4722]: I0219 19:57:41.798670 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:57:41 crc kubenswrapper[4722]: I0219 19:57:41.799236 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:57:59 crc kubenswrapper[4722]: I0219 19:57:59.479502 4722 generic.go:334] "Generic (PLEG): container finished" podID="a0d75723-6d9a-4609-a294-f179d1e84710" containerID="6e06dc95162c0d51603edd10c6b9f7656cb9b02520ae430117bbe54d6a6625f4" exitCode=0 Feb 19 19:57:59 crc kubenswrapper[4722]: I0219 19:57:59.479626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerDied","Data":"6e06dc95162c0d51603edd10c6b9f7656cb9b02520ae430117bbe54d6a6625f4"} Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.034190 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.206333 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.206758 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.206904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.207044 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.207299 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.220352 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.220480 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx" (OuterVolumeSpecName: "kube-api-access-jjpjx") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "kube-api-access-jjpjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.235669 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.238248 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.239844 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory" (OuterVolumeSpecName: "inventory") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310656 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310694 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310703 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310712 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310720 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.506223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerDied","Data":"17f45e35533bb8a07ee9122a5653857a6db08cf7018f843e7d190f9e046c6b5c"} Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.506274 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f45e35533bb8a07ee9122a5653857a6db08cf7018f843e7d190f9e046c6b5c" Feb 19 
19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.506321 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.620532 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2"] Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621002 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621017 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621046 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="extract-content" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621055 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="extract-content" Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621077 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d75723-6d9a-4609-a294-f179d1e84710" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621087 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d75723-6d9a-4609-a294-f179d1e84710" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621105 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="extract-utilities" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621114 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" 
containerName="extract-utilities" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621376 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d75723-6d9a-4609-a294-f179d1e84710" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621403 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.622296 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625340 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625500 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625628 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625746 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625880 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.628354 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.628487 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.636316 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2"] Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724947 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725093 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725414 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725462 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827330 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: 
\"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827373 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827629 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: 
\"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.828677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.831438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.833123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.833595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.834043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.835067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.843569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.955618 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:02 crc kubenswrapper[4722]: I0219 19:58:02.492329 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2"] Feb 19 19:58:02 crc kubenswrapper[4722]: I0219 19:58:02.499394 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:58:02 crc kubenswrapper[4722]: I0219 19:58:02.518556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerStarted","Data":"3c98bf09b5168c37261366bfdf36c471ccfc735e44cc5f381dee883dd636c28f"} Feb 19 19:58:03 crc kubenswrapper[4722]: I0219 19:58:03.530873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerStarted","Data":"15f3bf7b4aaeb40fcc3e8b1c6a8270cdc8388a64a0038be83798c738a35d98e7"} Feb 19 19:58:03 crc kubenswrapper[4722]: I0219 19:58:03.555635 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" podStartSLOduration=2.002208127 podStartE2EDuration="2.555614234s" podCreationTimestamp="2026-02-19 19:58:01 +0000 UTC" firstStartedPulling="2026-02-19 19:58:02.499135057 +0000 UTC m=+2382.111485381" lastFinishedPulling="2026-02-19 19:58:03.052541164 +0000 UTC m=+2382.664891488" observedRunningTime="2026-02-19 19:58:03.547764869 +0000 UTC m=+2383.160115193" watchObservedRunningTime="2026-02-19 19:58:03.555614234 +0000 UTC m=+2383.167964558" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.798957 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.799602 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.799655 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.800609 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.800682 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" gracePeriod=600 Feb 19 19:58:11 crc kubenswrapper[4722]: E0219 19:58:11.931873 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.650489 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" exitCode=0 Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.650572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209"} Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.650858 4722 scope.go:117] "RemoveContainer" containerID="2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef" Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.651692 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:12 crc kubenswrapper[4722]: E0219 19:58:12.652086 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:25 crc kubenswrapper[4722]: I0219 19:58:25.071261 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:25 crc kubenswrapper[4722]: E0219 19:58:25.072039 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:38 crc kubenswrapper[4722]: I0219 19:58:38.071857 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:38 crc kubenswrapper[4722]: E0219 19:58:38.072769 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:50 crc kubenswrapper[4722]: I0219 19:58:50.071643 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:50 crc kubenswrapper[4722]: E0219 19:58:50.073117 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:05 crc kubenswrapper[4722]: I0219 19:59:05.071544 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:05 crc kubenswrapper[4722]: E0219 19:59:05.073180 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:16 crc kubenswrapper[4722]: I0219 19:59:16.071697 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:16 crc kubenswrapper[4722]: E0219 19:59:16.072534 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:29 crc kubenswrapper[4722]: I0219 19:59:29.071644 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:29 crc kubenswrapper[4722]: E0219 19:59:29.072624 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:44 crc kubenswrapper[4722]: I0219 19:59:44.071718 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:44 crc kubenswrapper[4722]: E0219 19:59:44.072545 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:55 crc kubenswrapper[4722]: I0219 19:59:55.071733 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:55 crc kubenswrapper[4722]: E0219 19:59:55.072428 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.151746 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx"] Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.153848 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.159941 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.160112 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.179915 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx"] Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.200202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.200309 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.200437 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.301630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.301795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.301890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.302804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.308087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.326716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.478446 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.982131 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx"] Feb 19 20:00:01 crc kubenswrapper[4722]: I0219 20:00:01.662377 4722 generic.go:334] "Generic (PLEG): container finished" podID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerID="893890de942723d2a9f1e2aaf534f954497c2f692f3e27356bd25ee378bc213a" exitCode=0 Feb 19 20:00:01 crc kubenswrapper[4722]: I0219 20:00:01.662583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" event={"ID":"9a47544a-9ba7-49a2-b611-fe1965ebaf42","Type":"ContainerDied","Data":"893890de942723d2a9f1e2aaf534f954497c2f692f3e27356bd25ee378bc213a"} Feb 19 20:00:01 crc kubenswrapper[4722]: I0219 20:00:01.662674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" 
event={"ID":"9a47544a-9ba7-49a2-b611-fe1965ebaf42","Type":"ContainerStarted","Data":"0468828ef75d8002ff15ef245f1dfd69975f85a3deb54b48830d39542a316e7a"} Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.111537 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.262857 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.262912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.262977 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.264404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume" (OuterVolumeSpecName: "config-volume") pod "9a47544a-9ba7-49a2-b611-fe1965ebaf42" (UID: "9a47544a-9ba7-49a2-b611-fe1965ebaf42"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.270908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9a47544a-9ba7-49a2-b611-fe1965ebaf42" (UID: "9a47544a-9ba7-49a2-b611-fe1965ebaf42"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.271085 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55" (OuterVolumeSpecName: "kube-api-access-rvb55") pod "9a47544a-9ba7-49a2-b611-fe1965ebaf42" (UID: "9a47544a-9ba7-49a2-b611-fe1965ebaf42"). InnerVolumeSpecName "kube-api-access-rvb55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.366788 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.366826 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.366838 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.686612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" 
event={"ID":"9a47544a-9ba7-49a2-b611-fe1965ebaf42","Type":"ContainerDied","Data":"0468828ef75d8002ff15ef245f1dfd69975f85a3deb54b48830d39542a316e7a"} Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.686978 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0468828ef75d8002ff15ef245f1dfd69975f85a3deb54b48830d39542a316e7a" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.686732 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:04 crc kubenswrapper[4722]: I0219 20:00:04.198205 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 19 20:00:04 crc kubenswrapper[4722]: I0219 20:00:04.211286 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 19 20:00:05 crc kubenswrapper[4722]: I0219 20:00:05.085296 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" path="/var/lib/kubelet/pods/0d5e5981-45e4-4970-bff2-17a6087915e9/volumes" Feb 19 20:00:07 crc kubenswrapper[4722]: I0219 20:00:07.072165 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:07 crc kubenswrapper[4722]: E0219 20:00:07.073434 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:12 crc kubenswrapper[4722]: I0219 20:00:12.771677 4722 generic.go:334] "Generic (PLEG): 
container finished" podID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerID="15f3bf7b4aaeb40fcc3e8b1c6a8270cdc8388a64a0038be83798c738a35d98e7" exitCode=0 Feb 19 20:00:12 crc kubenswrapper[4722]: I0219 20:00:12.771712 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerDied","Data":"15f3bf7b4aaeb40fcc3e8b1c6a8270cdc8388a64a0038be83798c738a35d98e7"} Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.305573 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402450 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402678 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402818 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402913 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: 
\"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402976 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.403030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.411545 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.437354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7" (OuterVolumeSpecName: "kube-api-access-c2sz7") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "kube-api-access-c2sz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.437659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.439341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.440210 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory" (OuterVolumeSpecName: "inventory") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.442715 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.444481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.446265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.448525 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.455742 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.463772 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505266 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505307 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505319 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505332 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505345 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505356 4722 
reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505369 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505380 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505391 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505403 4722 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505414 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.790614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerDied","Data":"3c98bf09b5168c37261366bfdf36c471ccfc735e44cc5f381dee883dd636c28f"} Feb 19 20:00:14 
crc kubenswrapper[4722]: I0219 20:00:14.791392 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c98bf09b5168c37261366bfdf36c471ccfc735e44cc5f381dee883dd636c28f" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.790704 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903022 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp"] Feb 19 20:00:14 crc kubenswrapper[4722]: E0219 20:00:14.903531 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerName="collect-profiles" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903553 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerName="collect-profiles" Feb 19 20:00:14 crc kubenswrapper[4722]: E0219 20:00:14.903580 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903588 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903833 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.904057 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerName="collect-profiles" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.904934 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.910975 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911262 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911445 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911711 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.918447 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp"] Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014709 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014786 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.015049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117206 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 
20:00:15.117322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.130317 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.131728 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.132299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.132817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.137192 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.137737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.197922 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.229471 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: W0219 20:00:15.822421 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2c74da_6ac0_4070_9f5a_577bc5c64771.slice/crio-02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769 WatchSource:0}: Error finding container 02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769: Status 404 returned error can't find the container with id 02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769 Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.831522 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp"] Feb 19 20:00:16 crc kubenswrapper[4722]: I0219 20:00:16.818856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerStarted","Data":"10a106f9d091403365a306069c9198e890b02573d70dd878e404c55981b55a77"} Feb 19 20:00:16 crc kubenswrapper[4722]: I0219 20:00:16.819469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerStarted","Data":"02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769"} Feb 19 20:00:16 crc kubenswrapper[4722]: I0219 20:00:16.845953 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" podStartSLOduration=2.433483937 podStartE2EDuration="2.845936556s" podCreationTimestamp="2026-02-19 20:00:14 +0000 UTC" firstStartedPulling="2026-02-19 20:00:15.824382008 +0000 UTC m=+2515.436732332" lastFinishedPulling="2026-02-19 20:00:16.236834627 +0000 UTC m=+2515.849184951" 
observedRunningTime="2026-02-19 20:00:16.84545164 +0000 UTC m=+2516.457801994" watchObservedRunningTime="2026-02-19 20:00:16.845936556 +0000 UTC m=+2516.458286880" Feb 19 20:00:22 crc kubenswrapper[4722]: I0219 20:00:22.071844 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:22 crc kubenswrapper[4722]: E0219 20:00:22.072677 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:37 crc kubenswrapper[4722]: I0219 20:00:37.071726 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:37 crc kubenswrapper[4722]: E0219 20:00:37.072630 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.295017 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.298449 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.331905 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.415003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.415205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.415242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517133 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517299 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517777 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.542917 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.632617 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:44 crc kubenswrapper[4722]: I0219 20:00:44.177869 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:45 crc kubenswrapper[4722]: I0219 20:00:45.132911 4722 generic.go:334] "Generic (PLEG): container finished" podID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" exitCode=0 Feb 19 20:00:45 crc kubenswrapper[4722]: I0219 20:00:45.132967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c"} Feb 19 20:00:45 crc kubenswrapper[4722]: I0219 20:00:45.133289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerStarted","Data":"182cc0081f8daaf558c0e18ef713d4c999f04aa48ebdb22e2a458e3216bb9ceb"} Feb 19 20:00:46 crc kubenswrapper[4722]: I0219 20:00:46.174953 4722 scope.go:117] "RemoveContainer" containerID="2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b" Feb 19 20:00:47 crc kubenswrapper[4722]: I0219 20:00:47.154475 4722 generic.go:334] "Generic (PLEG): container finished" podID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" exitCode=0 Feb 19 20:00:47 crc kubenswrapper[4722]: I0219 20:00:47.154563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec"} Feb 19 20:00:48 crc kubenswrapper[4722]: I0219 20:00:48.167240 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerStarted","Data":"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833"} Feb 19 20:00:48 crc kubenswrapper[4722]: I0219 20:00:48.194977 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hgj8p" podStartSLOduration=2.756886337 podStartE2EDuration="5.194953986s" podCreationTimestamp="2026-02-19 20:00:43 +0000 UTC" firstStartedPulling="2026-02-19 20:00:45.135232747 +0000 UTC m=+2544.747583071" lastFinishedPulling="2026-02-19 20:00:47.573300396 +0000 UTC m=+2547.185650720" observedRunningTime="2026-02-19 20:00:48.185682758 +0000 UTC m=+2547.798033092" watchObservedRunningTime="2026-02-19 20:00:48.194953986 +0000 UTC m=+2547.807304310" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.077287 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:51 crc kubenswrapper[4722]: E0219 20:00:51.078061 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.705648 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.708450 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.720385 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.786633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.786791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.786966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889146 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889355 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889438 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.914038 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:52 crc kubenswrapper[4722]: I0219 20:00:52.045195 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:52 crc kubenswrapper[4722]: I0219 20:00:52.561572 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.218316 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" exitCode=0 Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.218366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce"} Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.218572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerStarted","Data":"c0e49dde2201530e923ca42ac66ad72cb85c66f2f04363fd7cf12ad95275cc23"} Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.633699 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.633991 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.691830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:54 crc kubenswrapper[4722]: I0219 20:00:54.233763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" 
event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerStarted","Data":"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed"} Feb 19 20:00:54 crc kubenswrapper[4722]: I0219 20:00:54.284641 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.081375 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.255493 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" exitCode=0 Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.255573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed"} Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.255740 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hgj8p" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" containerID="cri-o://8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" gracePeriod=2 Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.871842 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.906711 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.906784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.906893 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.908511 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities" (OuterVolumeSpecName: "utilities") pod "a5bc0218-2703-4ebb-86a4-8bbcffe69121" (UID: "a5bc0218-2703-4ebb-86a4-8bbcffe69121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.929329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5bc0218-2703-4ebb-86a4-8bbcffe69121" (UID: "a5bc0218-2703-4ebb-86a4-8bbcffe69121"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.938184 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs" (OuterVolumeSpecName: "kube-api-access-w2xxs") pod "a5bc0218-2703-4ebb-86a4-8bbcffe69121" (UID: "a5bc0218-2703-4ebb-86a4-8bbcffe69121"). InnerVolumeSpecName "kube-api-access-w2xxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.008609 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.008640 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.008649 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.266926 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.266995 4722 generic.go:334] "Generic (PLEG): container finished" podID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" exitCode=0 Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.267056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833"} Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.267102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"182cc0081f8daaf558c0e18ef713d4c999f04aa48ebdb22e2a458e3216bb9ceb"} Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.267125 4722 scope.go:117] "RemoveContainer" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.270717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerStarted","Data":"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134"} Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.292078 4722 scope.go:117] "RemoveContainer" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.307514 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.316835 4722 scope.go:117] "RemoveContainer" 
containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.324544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.329232 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5nsl" podStartSLOduration=2.818583787 podStartE2EDuration="6.329205229s" podCreationTimestamp="2026-02-19 20:00:51 +0000 UTC" firstStartedPulling="2026-02-19 20:00:53.220178829 +0000 UTC m=+2552.832529143" lastFinishedPulling="2026-02-19 20:00:56.730800251 +0000 UTC m=+2556.343150585" observedRunningTime="2026-02-19 20:00:57.307483223 +0000 UTC m=+2556.919833547" watchObservedRunningTime="2026-02-19 20:00:57.329205229 +0000 UTC m=+2556.941555563" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.345032 4722 scope.go:117] "RemoveContainer" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" Feb 19 20:00:57 crc kubenswrapper[4722]: E0219 20:00:57.345735 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833\": container with ID starting with 8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833 not found: ID does not exist" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.345848 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833"} err="failed to get container status \"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833\": rpc error: code = NotFound desc = could not find container \"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833\": 
container with ID starting with 8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833 not found: ID does not exist" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.345904 4722 scope.go:117] "RemoveContainer" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" Feb 19 20:00:57 crc kubenswrapper[4722]: E0219 20:00:57.346585 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec\": container with ID starting with 6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec not found: ID does not exist" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.346636 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec"} err="failed to get container status \"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec\": rpc error: code = NotFound desc = could not find container \"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec\": container with ID starting with 6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec not found: ID does not exist" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.346659 4722 scope.go:117] "RemoveContainer" containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" Feb 19 20:00:57 crc kubenswrapper[4722]: E0219 20:00:57.347013 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c\": container with ID starting with 87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c not found: ID does not exist" 
containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.347048 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c"} err="failed to get container status \"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c\": rpc error: code = NotFound desc = could not find container \"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c\": container with ID starting with 87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c not found: ID does not exist" Feb 19 20:00:59 crc kubenswrapper[4722]: I0219 20:00:59.081655 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" path="/var/lib/kubelet/pods/a5bc0218-2703-4ebb-86a4-8bbcffe69121/volumes" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.167867 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525521-9cbvt"] Feb 19 20:01:00 crc kubenswrapper[4722]: E0219 20:01:00.168534 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168558 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" Feb 19 20:01:00 crc kubenswrapper[4722]: E0219 20:01:00.168600 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-utilities" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168611 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-utilities" Feb 19 20:01:00 crc kubenswrapper[4722]: E0219 20:01:00.168641 4722 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-content" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168653 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-content" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168995 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.174596 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.191911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525521-9cbvt"] Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"keystone-cron-29525521-9cbvt\" (UID: 
\"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397327 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397573 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"keystone-cron-29525521-9cbvt\" (UID: 
\"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.403776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.417134 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.417200 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.420893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.497228 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.941760 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525521-9cbvt"] Feb 19 20:01:01 crc kubenswrapper[4722]: I0219 20:01:01.315921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerStarted","Data":"342f08efdd598eafc0f8ea31b816d469b95fae87bad643fc4584f90d98ad449d"} Feb 19 20:01:01 crc kubenswrapper[4722]: I0219 20:01:01.315970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerStarted","Data":"2ecac706e54aad4a3908fec9e0940aa1fccbea84739f7ef2dc757b6179a2f248"} Feb 19 20:01:01 crc kubenswrapper[4722]: I0219 20:01:01.336006 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525521-9cbvt" podStartSLOduration=1.335988681 podStartE2EDuration="1.335988681s" podCreationTimestamp="2026-02-19 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:01:01.33403185 +0000 UTC m=+2560.946382174" watchObservedRunningTime="2026-02-19 20:01:01.335988681 +0000 UTC m=+2560.948338995" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.047332 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.048375 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.134125 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.379989 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.437598 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.070914 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:04 crc kubenswrapper[4722]: E0219 20:01:04.071454 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.350213 4722 generic.go:334] "Generic (PLEG): container finished" podID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerID="342f08efdd598eafc0f8ea31b816d469b95fae87bad643fc4584f90d98ad449d" exitCode=0 Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.350285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerDied","Data":"342f08efdd598eafc0f8ea31b816d469b95fae87bad643fc4584f90d98ad449d"} Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.350426 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5nsl" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" 
containerID="cri-o://150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" gracePeriod=2 Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.911641 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.989353 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"6f75f82e-3471-4ff7-92e0-d758f55f5394\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.989554 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"6f75f82e-3471-4ff7-92e0-d758f55f5394\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.989691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"6f75f82e-3471-4ff7-92e0-d758f55f5394\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.991043 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities" (OuterVolumeSpecName: "utilities") pod "6f75f82e-3471-4ff7-92e0-d758f55f5394" (UID: "6f75f82e-3471-4ff7-92e0-d758f55f5394"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.996431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h" (OuterVolumeSpecName: "kube-api-access-2jt5h") pod "6f75f82e-3471-4ff7-92e0-d758f55f5394" (UID: "6f75f82e-3471-4ff7-92e0-d758f55f5394"). InnerVolumeSpecName "kube-api-access-2jt5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.041989 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f75f82e-3471-4ff7-92e0-d758f55f5394" (UID: "6f75f82e-3471-4ff7-92e0-d758f55f5394"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.092849 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.093872 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.093948 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.360945 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f75f82e-3471-4ff7-92e0-d758f55f5394" 
containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" exitCode=0 Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361021 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134"} Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361439 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"c0e49dde2201530e923ca42ac66ad72cb85c66f2f04363fd7cf12ad95275cc23"} Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361475 4722 scope.go:117] "RemoveContainer" containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.391991 4722 scope.go:117] "RemoveContainer" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.392029 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.402844 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.410344 4722 scope.go:117] "RemoveContainer" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.463563 4722 scope.go:117] "RemoveContainer" containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" Feb 19 
20:01:05 crc kubenswrapper[4722]: E0219 20:01:05.466806 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134\": container with ID starting with 150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134 not found: ID does not exist" containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.466885 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134"} err="failed to get container status \"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134\": rpc error: code = NotFound desc = could not find container \"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134\": container with ID starting with 150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134 not found: ID does not exist" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.466926 4722 scope.go:117] "RemoveContainer" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" Feb 19 20:01:05 crc kubenswrapper[4722]: E0219 20:01:05.467279 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed\": container with ID starting with 3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed not found: ID does not exist" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.467323 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed"} err="failed to get container status 
\"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed\": rpc error: code = NotFound desc = could not find container \"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed\": container with ID starting with 3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed not found: ID does not exist" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.467349 4722 scope.go:117] "RemoveContainer" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" Feb 19 20:01:05 crc kubenswrapper[4722]: E0219 20:01:05.467626 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce\": container with ID starting with cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce not found: ID does not exist" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.467671 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce"} err="failed to get container status \"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce\": rpc error: code = NotFound desc = could not find container \"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce\": container with ID starting with cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce not found: ID does not exist" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.799971 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805387 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805447 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.811972 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.812033 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z" (OuterVolumeSpecName: "kube-api-access-nqw5z") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "kube-api-access-nqw5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.853456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.865289 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data" (OuterVolumeSpecName: "config-data") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907239 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907270 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907282 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907291 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:06 crc kubenswrapper[4722]: I0219 20:01:06.385933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerDied","Data":"2ecac706e54aad4a3908fec9e0940aa1fccbea84739f7ef2dc757b6179a2f248"} Feb 19 20:01:06 crc kubenswrapper[4722]: I0219 20:01:06.385983 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ecac706e54aad4a3908fec9e0940aa1fccbea84739f7ef2dc757b6179a2f248" Feb 19 20:01:06 crc kubenswrapper[4722]: I0219 20:01:06.386089 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:07 crc kubenswrapper[4722]: I0219 20:01:07.083425 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" path="/var/lib/kubelet/pods/6f75f82e-3471-4ff7-92e0-d758f55f5394/volumes" Feb 19 20:01:16 crc kubenswrapper[4722]: I0219 20:01:16.071777 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:16 crc kubenswrapper[4722]: E0219 20:01:16.072902 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:27 crc kubenswrapper[4722]: I0219 20:01:27.071763 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:27 crc kubenswrapper[4722]: E0219 20:01:27.072604 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.769387 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.771969 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-utilities" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.771989 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-utilities" Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.772001 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772008 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.772053 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerName="keystone-cron" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772062 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerName="keystone-cron" Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.772094 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-content" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772100 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-content" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772436 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772472 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerName="keystone-cron" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.787290 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.802369 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.857621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.858028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.858230 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.959956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960131 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960731 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.993123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:29 crc kubenswrapper[4722]: I0219 20:01:29.114135 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:29 crc kubenswrapper[4722]: I0219 20:01:29.692934 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:30 crc kubenswrapper[4722]: I0219 20:01:30.651607 4722 generic.go:334] "Generic (PLEG): container finished" podID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" exitCode=0 Feb 19 20:01:30 crc kubenswrapper[4722]: I0219 20:01:30.651683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9"} Feb 19 20:01:30 crc kubenswrapper[4722]: I0219 20:01:30.651928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerStarted","Data":"0ae6aa45c815f48f8f0269466e13089e0f93eaf09184b6a3ba49dc588c51d376"} Feb 19 20:01:31 crc kubenswrapper[4722]: I0219 20:01:31.665470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerStarted","Data":"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff"} Feb 19 20:01:35 crc kubenswrapper[4722]: I0219 20:01:35.707485 4722 generic.go:334] "Generic (PLEG): container finished" podID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" exitCode=0 Feb 19 20:01:35 crc kubenswrapper[4722]: I0219 20:01:35.707560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" 
event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff"} Feb 19 20:01:36 crc kubenswrapper[4722]: I0219 20:01:36.720868 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerStarted","Data":"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0"} Feb 19 20:01:36 crc kubenswrapper[4722]: I0219 20:01:36.747583 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n2rkv" podStartSLOduration=3.219179615 podStartE2EDuration="8.747564606s" podCreationTimestamp="2026-02-19 20:01:28 +0000 UTC" firstStartedPulling="2026-02-19 20:01:30.653422492 +0000 UTC m=+2590.265772816" lastFinishedPulling="2026-02-19 20:01:36.181807483 +0000 UTC m=+2595.794157807" observedRunningTime="2026-02-19 20:01:36.738461583 +0000 UTC m=+2596.350811907" watchObservedRunningTime="2026-02-19 20:01:36.747564606 +0000 UTC m=+2596.359914930" Feb 19 20:01:39 crc kubenswrapper[4722]: I0219 20:01:39.071842 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:39 crc kubenswrapper[4722]: E0219 20:01:39.072501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:39 crc kubenswrapper[4722]: I0219 20:01:39.114472 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:39 crc 
kubenswrapper[4722]: I0219 20:01:39.114512 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:40 crc kubenswrapper[4722]: I0219 20:01:40.172901 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n2rkv" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" probeResult="failure" output=< Feb 19 20:01:40 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 20:01:40 crc kubenswrapper[4722]: > Feb 19 20:01:49 crc kubenswrapper[4722]: I0219 20:01:49.194475 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:49 crc kubenswrapper[4722]: I0219 20:01:49.270078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:49 crc kubenswrapper[4722]: I0219 20:01:49.444326 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:50 crc kubenswrapper[4722]: I0219 20:01:50.843209 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n2rkv" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" containerID="cri-o://9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" gracePeriod=2 Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.072319 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 20:01:51.072937 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.403538 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.523334 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.523722 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.523885 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.525534 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities" (OuterVolumeSpecName: "utilities") pod "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" (UID: "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.530211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc" (OuterVolumeSpecName: "kube-api-access-qmhbc") pod "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" (UID: "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0"). InnerVolumeSpecName "kube-api-access-qmhbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.626187 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.626222 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.655893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" (UID: "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.728928 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856118 4722 generic.go:334] "Generic (PLEG): container finished" podID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" exitCode=0 Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856218 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0"} Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856277 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"0ae6aa45c815f48f8f0269466e13089e0f93eaf09184b6a3ba49dc588c51d376"} Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856328 4722 scope.go:117] "RemoveContainer" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.880064 4722 scope.go:117] "RemoveContainer" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.901697 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.914893 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.920833 4722 scope.go:117] "RemoveContainer" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.979131 4722 scope.go:117] "RemoveContainer" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 20:01:51.979533 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0\": container with ID starting with 9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0 not found: ID does not exist" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.979560 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0"} err="failed to get container status \"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0\": rpc error: code = NotFound desc = could not find container \"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0\": container with ID starting with 9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0 not found: ID does not exist" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.979633 4722 scope.go:117] "RemoveContainer" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 20:01:51.980109 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff\": container with ID starting with 94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff not found: ID does not exist" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.980130 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff"} err="failed to get container status \"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff\": rpc error: code = NotFound desc = could not find container \"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff\": container with ID starting with 94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff not found: ID does not exist" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.980142 4722 scope.go:117] "RemoveContainer" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 
20:01:51.980381 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9\": container with ID starting with 7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9 not found: ID does not exist" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.980400 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9"} err="failed to get container status \"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9\": rpc error: code = NotFound desc = could not find container \"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9\": container with ID starting with 7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9 not found: ID does not exist" Feb 19 20:01:53 crc kubenswrapper[4722]: I0219 20:01:53.096257 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" path="/var/lib/kubelet/pods/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0/volumes" Feb 19 20:02:05 crc kubenswrapper[4722]: I0219 20:02:05.072074 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:05 crc kubenswrapper[4722]: E0219 20:02:05.072886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:17 crc kubenswrapper[4722]: I0219 20:02:17.071406 
4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:17 crc kubenswrapper[4722]: E0219 20:02:17.073531 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:23 crc kubenswrapper[4722]: I0219 20:02:23.141783 4722 generic.go:334] "Generic (PLEG): container finished" podID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerID="10a106f9d091403365a306069c9198e890b02573d70dd878e404c55981b55a77" exitCode=0 Feb 19 20:02:23 crc kubenswrapper[4722]: I0219 20:02:23.141913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerDied","Data":"10a106f9d091403365a306069c9198e890b02573d70dd878e404c55981b55a77"} Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.624274 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821405 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821473 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821580 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc 
kubenswrapper[4722]: I0219 20:02:24.821831 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.827317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.828617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk" (OuterVolumeSpecName: "kube-api-access-q8jjk") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "kube-api-access-q8jjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.855933 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory" (OuterVolumeSpecName: "inventory") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.857911 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.858668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.858840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.860943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925356 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925407 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925437 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925449 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925461 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925474 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:25 crc kubenswrapper[4722]: I0219 20:02:25.163304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerDied","Data":"02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769"} Feb 19 20:02:25 crc kubenswrapper[4722]: I0219 20:02:25.163355 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769" Feb 19 20:02:25 crc kubenswrapper[4722]: I0219 20:02:25.163375 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:02:31 crc kubenswrapper[4722]: I0219 20:02:31.077931 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:31 crc kubenswrapper[4722]: E0219 20:02:31.078760 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:45 crc kubenswrapper[4722]: I0219 20:02:45.071811 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:45 crc kubenswrapper[4722]: E0219 20:02:45.073362 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:57 crc kubenswrapper[4722]: I0219 20:02:57.072087 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:57 crc kubenswrapper[4722]: E0219 20:02:57.073619 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:03:09 crc kubenswrapper[4722]: I0219 20:03:09.072200 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:03:09 crc kubenswrapper[4722]: E0219 20:03:09.072904 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:03:24 crc kubenswrapper[4722]: I0219 20:03:24.071401 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:03:24 crc kubenswrapper[4722]: I0219 20:03:24.715098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9"} Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.788217 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.788983 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-utilities" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.788995 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-utilities" Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.789012 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789019 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.789033 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-content" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789039 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-content" Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.789067 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789074 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789258 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789280 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.790346 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.800288 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rtsd9"/"openshift-service-ca.crt" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.800557 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rtsd9"/"kube-root-ca.crt" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.812753 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.899490 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.899604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.001617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.001985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.002470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.026768 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.111300 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.853546 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.856553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:03:32 crc kubenswrapper[4722]: I0219 20:03:32.810168 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerStarted","Data":"d066611f93dcd72357fa176895ee015380e463febd20c32f8384d090af7a454a"} Feb 19 20:03:40 crc kubenswrapper[4722]: I0219 20:03:40.913428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerStarted","Data":"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073"} Feb 19 20:03:40 crc kubenswrapper[4722]: I0219 20:03:40.913969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerStarted","Data":"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"} Feb 19 20:03:40 crc kubenswrapper[4722]: I0219 20:03:40.947410 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rtsd9/must-gather-h964l" podStartSLOduration=3.134192971 podStartE2EDuration="10.947386588s" podCreationTimestamp="2026-02-19 20:03:30 +0000 UTC" firstStartedPulling="2026-02-19 20:03:31.853500299 +0000 UTC m=+2711.465850623" lastFinishedPulling="2026-02-19 20:03:39.666693896 +0000 UTC m=+2719.279044240" observedRunningTime="2026-02-19 20:03:40.932532906 +0000 UTC m=+2720.544883270" watchObservedRunningTime="2026-02-19 20:03:40.947386588 +0000 UTC 
m=+2720.559736912" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.579421 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-tbt92"] Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.582686 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.584642 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rtsd9"/"default-dockercfg-k8bvg" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.660108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.660382 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.762666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.762847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod 
\"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.762948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.790485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.904459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.957842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" event={"ID":"5c02eb8b-79a9-47f3-823d-6919493345f2","Type":"ContainerStarted","Data":"44c6450afcb051e28021497062327988071c4a584fd190fe0c5ef8dba43c97e3"} Feb 19 20:03:57 crc kubenswrapper[4722]: I0219 20:03:57.122057 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" event={"ID":"5c02eb8b-79a9-47f3-823d-6919493345f2","Type":"ContainerStarted","Data":"9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef"} Feb 19 20:03:57 crc kubenswrapper[4722]: I0219 20:03:57.145246 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" podStartSLOduration=1.351131539 podStartE2EDuration="14.145216082s" podCreationTimestamp="2026-02-19 20:03:43 +0000 UTC" 
firstStartedPulling="2026-02-19 20:03:43.937434522 +0000 UTC m=+2723.549784846" lastFinishedPulling="2026-02-19 20:03:56.731519075 +0000 UTC m=+2736.343869389" observedRunningTime="2026-02-19 20:03:57.141068132 +0000 UTC m=+2736.753418456" watchObservedRunningTime="2026-02-19 20:03:57.145216082 +0000 UTC m=+2736.757566406" Feb 19 20:04:14 crc kubenswrapper[4722]: I0219 20:04:14.279616 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerID="9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef" exitCode=0 Feb 19 20:04:14 crc kubenswrapper[4722]: I0219 20:04:14.279702 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" event={"ID":"5c02eb8b-79a9-47f3-823d-6919493345f2","Type":"ContainerDied","Data":"9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef"} Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.420079 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.495620 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-tbt92"] Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.511260 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-tbt92"] Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530281 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"5c02eb8b-79a9-47f3-823d-6919493345f2\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod \"5c02eb8b-79a9-47f3-823d-6919493345f2\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530526 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host" (OuterVolumeSpecName: "host") pod "5c02eb8b-79a9-47f3-823d-6919493345f2" (UID: "5c02eb8b-79a9-47f3-823d-6919493345f2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530986 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.537377 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn" (OuterVolumeSpecName: "kube-api-access-z2dsn") pod "5c02eb8b-79a9-47f3-823d-6919493345f2" (UID: "5c02eb8b-79a9-47f3-823d-6919493345f2"). InnerVolumeSpecName "kube-api-access-z2dsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.632682 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.299034 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c6450afcb051e28021497062327988071c4a584fd190fe0c5ef8dba43c97e3" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.299112 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.689761 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-lxgjq"] Feb 19 20:04:16 crc kubenswrapper[4722]: E0219 20:04:16.690244 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerName="container-00" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.690260 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerName="container-00" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.690436 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerName="container-00" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.691208 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.693509 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rtsd9"/"default-dockercfg-k8bvg" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.755546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.755885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " 
pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.857444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.857622 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.857912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.879421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.013799 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.082974 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" path="/var/lib/kubelet/pods/5c02eb8b-79a9-47f3-823d-6919493345f2/volumes" Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.309949 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" event={"ID":"ed8cf44d-7e13-4277-a920-9fb05b46572a","Type":"ContainerStarted","Data":"68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c"} Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.309990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" event={"ID":"ed8cf44d-7e13-4277-a920-9fb05b46572a","Type":"ContainerStarted","Data":"db660097628c3c55eab2e5d6d408b1ff91c08194bf5ea622ef5b44f7227d6b12"} Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.328830 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" podStartSLOduration=1.328816265 podStartE2EDuration="1.328816265s" podCreationTimestamp="2026-02-19 20:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:04:17.328448714 +0000 UTC m=+2756.940799038" watchObservedRunningTime="2026-02-19 20:04:17.328816265 +0000 UTC m=+2756.941166589" Feb 19 20:04:18 crc kubenswrapper[4722]: I0219 20:04:18.320514 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerID="68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c" exitCode=1 Feb 19 20:04:18 crc kubenswrapper[4722]: I0219 20:04:18.320540 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" 
event={"ID":"ed8cf44d-7e13-4277-a920-9fb05b46572a","Type":"ContainerDied","Data":"68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c"} Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.447267 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.479716 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-lxgjq"] Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.492263 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-lxgjq"] Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.518437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"ed8cf44d-7e13-4277-a920-9fb05b46572a\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.518538 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"ed8cf44d-7e13-4277-a920-9fb05b46572a\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.518671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host" (OuterVolumeSpecName: "host") pod "ed8cf44d-7e13-4277-a920-9fb05b46572a" (UID: "ed8cf44d-7e13-4277-a920-9fb05b46572a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.519327 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.523836 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk" (OuterVolumeSpecName: "kube-api-access-w72qk") pod "ed8cf44d-7e13-4277-a920-9fb05b46572a" (UID: "ed8cf44d-7e13-4277-a920-9fb05b46572a"). InnerVolumeSpecName "kube-api-access-w72qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.622518 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:20 crc kubenswrapper[4722]: I0219 20:04:20.338887 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db660097628c3c55eab2e5d6d408b1ff91c08194bf5ea622ef5b44f7227d6b12" Feb 19 20:04:20 crc kubenswrapper[4722]: I0219 20:04:20.339023 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:21 crc kubenswrapper[4722]: I0219 20:04:21.087381 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" path="/var/lib/kubelet/pods/ed8cf44d-7e13-4277-a920-9fb05b46572a/volumes" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.006534 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/init-config-reloader/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.214716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/config-reloader/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.239861 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/init-config-reloader/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.266604 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/alertmanager/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.574750 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-546c4d4684-6vk7j_a7701b23-dddb-4a45-8982-11ab69bc30b1/barbican-api/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.681496 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-546c4d4684-6vk7j_a7701b23-dddb-4a45-8982-11ab69bc30b1/barbican-api-log/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.718287 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-98b54b474-9tfhf_96ffdf9d-f932-419b-be31-9f38358d2db5/barbican-keystone-listener/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 
20:05:16.799937 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-98b54b474-9tfhf_96ffdf9d-f932-419b-be31-9f38358d2db5/barbican-keystone-listener-log/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.977788 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6767bd5ccf-ggbrg_66f5042d-2b30-4ac4-8594-cfc0f9590460/barbican-worker/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.992990 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6767bd5ccf-ggbrg_66f5042d-2b30-4ac4-8594-cfc0f9590460/barbican-worker-log/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.189104 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4_7573aaf8-263a-4e50-84da-58cf311829a9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.293529 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/ceilometer-central-agent/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.337783 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/ceilometer-notification-agent/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.412387 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/proxy-httpd/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.460524 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/sg-core/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.638397 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_8c8e6512-8007-4e99-8589-8dccb1975e3f/cinder-api/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.650198 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8c8e6512-8007-4e99-8589-8dccb1975e3f/cinder-api-log/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.879848 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_afcc30d0-b94c-4bf7-8736-fb35bc461fa2/cinder-scheduler/0.log" Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.914499 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_afcc30d0-b94c-4bf7-8736-fb35bc461fa2/probe/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.019747 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_fc650d44-069f-41ed-b944-f1168dd5b25c/cloudkitty-api/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.078796 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_fc650d44-069f-41ed-b944-f1168dd5b25c/cloudkitty-api-log/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.178787 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_53bc8f19-43b1-4297-a3db-986381793b6e/loki-compactor/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.358500 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-2j29g_47cbe0b4-7d45-486b-9e9b-964db524e7ab/gateway/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.377988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-llw6c_aba36975-65f4-4f71-a709-261d2b9255ea/loki-distributor/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.550254 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-qxjk2_fc37f35d-ac2f-40a0-90e1-40c3b80b1782/gateway/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.695753 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_15869f30-52a4-4db0-aca8-53c5b319f7a1/loki-index-gateway/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.788825 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_a3fc19f1-6f9f-4f35-a391-1f6743480bd3/loki-ingester/0.log" Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.915522 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-k6gcm_cad6276e-0607-49e0-8a90-a11e9b916991/loki-querier/0.log" Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.051742 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8_9babbc99-4133-47c1-85e5-95039351727b/loki-query-frontend/0.log" Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.360783 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7m66x_7a9a8806-dadf-4cd5-af24-fc35c7e52197/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.595011 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t_7cf0842e-58ac-4cd1-b26f-9fc131177aa9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.976539 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-8zj5g_f6d970a0-c801-4472-a3b6-eccd8335d0a8/init/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.170322 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-8zj5g_f6d970a0-c801-4472-a3b6-eccd8335d0a8/init/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.274120 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-8zj5g_f6d970a0-c801-4472-a3b6-eccd8335d0a8/dnsmasq-dns/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.454926 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8_23a67d89-596c-44f0-b19d-dc5d1eb3021e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.577088 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_84bb340d-f999-45fc-8e1c-d813e2ad4319/glance-httpd/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.590101 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_84bb340d-f999-45fc-8e1c-d813e2ad4319/glance-log/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.693189 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_99490f57-22ed-4652-a112-bf45feb67aee/glance-httpd/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.853514 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_99490f57-22ed-4652-a112-bf45feb67aee/glance-log/0.log" Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.977749 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6_0a95e206-d7b9-49a5-8efd-7cab72e48d9d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.143878 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zmp82_fa0d4605-cd87-49b1-b17f-8c0e06590afd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.412050 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7cb5f76f4-hx5jh_32b3c2bb-2288-4e2e-a9c6-d19cfe651181/keystone-api/0.log" Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.413390 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525521-9cbvt_973609f7-b4ce-41f2-ad80-83b1b1593e2f/keystone-cron/0.log" Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.534284 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f8493c9f-328a-446d-8110-5879a7aedd2b/kube-state-metrics/0.log" Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.690213 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2_a0d75723-6d9a-4609-a294-f179d1e84710/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.977619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8694c7b8f7-2td8g_a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b/neutron-api/0.log" Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.035243 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8694c7b8f7-2td8g_a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b/neutron-httpd/0.log" Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.217710 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf_ee896205-7724-47fe-9f87-f2efb9afa870/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.557358 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_5aaacc6a-6882-467d-b66f-0178ccd35955/nova-api-log/0.log" Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.698542 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5aaacc6a-6882-467d-b66f-0178ccd35955/nova-api-api/0.log" Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.897113 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_69f96c80-f951-453b-9880-ecd0591dc1bf/nova-cell0-conductor-conductor/0.log" Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.024360 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f7880856-0db7-4bbf-9202-04f90868fc1d/nova-cell1-conductor-conductor/0.log" Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.292885 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_168eaa46-c907-452a-8537-3cea6b524360/nova-cell1-novncproxy-novncproxy/0.log" Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.554353 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cdks2_67f05b1f-f720-4b77-967c-2649fd05cb09/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.921776 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f2a647c-7a68-4e2c-aabf-b18973b20ad0/nova-metadata-log/0.log" Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.274725 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6f1f5c9a-dacb-45b5-95bf-2e62a12a908b/nova-scheduler-scheduler/0.log" Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.423953 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a07f9633-74f5-48e5-8467-d649fc49a2ff/mysql-bootstrap/0.log" Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 
20:05:24.568594 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a07f9633-74f5-48e5-8467-d649fc49a2ff/mysql-bootstrap/0.log" Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.677831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a07f9633-74f5-48e5-8467-d649fc49a2ff/galera/0.log" Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.698494 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f2a647c-7a68-4e2c-aabf-b18973b20ad0/nova-metadata-metadata/0.log" Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.912383 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_53444e7f-4c1d-401b-9896-5ff9c4aab65a/mysql-bootstrap/0.log" Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.183383 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_53444e7f-4c1d-401b-9896-5ff9c4aab65a/mysql-bootstrap/0.log" Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.193988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_53444e7f-4c1d-401b-9896-5ff9c4aab65a/galera/0.log" Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.339826 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_af557f35-ca9e-4990-bdcb-9e44366dab68/openstackclient/0.log" Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.463452 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6tmmr_293cde43-7bcf-4638-a080-badb26c81138/ovn-controller/0.log" Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.701817 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-282bs_9470e2b8-0f01-4735-8050-1bae363b3a02/openstack-network-exporter/0.log" Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.859844 4722 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovsdb-server-init/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.074823 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovs-vswitchd/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.112681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovsdb-server/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.121459 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovsdb-server-init/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.365983 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v5f89_9ff9829f-e8f9-4d78-9826-0385817cf2a4/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.542676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6f8e6f58-f989-41f2-b8cb-c798405cfa33/ovn-northd/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.584533 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6f8e6f58-f989-41f2-b8cb-c798405cfa33/openstack-network-exporter/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.766730 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13228713-9349-4241-b1f7-67f9a2c705fa/openstack-network-exporter/0.log" Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.819424 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13228713-9349-4241-b1f7-67f9a2c705fa/ovsdbserver-nb/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 
20:05:27.184492 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_05a27e5a-189e-4d17-9823-d95ef7906a7b/openstack-network-exporter/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.296988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_05a27e5a-189e-4d17-9823-d95ef7906a7b/ovsdbserver-sb/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.410619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cc7c8879d-tnbfs_41b669ab-d733-4941-b134-b9ad19b38143/placement-api/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.499905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cc7c8879d-tnbfs_41b669ab-d733-4941-b134-b9ad19b38143/placement-log/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.660557 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/init-config-reloader/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.833109 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/init-config-reloader/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.866786 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/config-reloader/0.log" Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.916089 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/prometheus/0.log" Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.149059 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/thanos-sidecar/0.log" Feb 19 20:05:28 crc 
kubenswrapper[4722]: I0219 20:05:28.181077 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9ac0e00c-0e1d-40fa-802d-8a77ac4c842b/setup-container/0.log" Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.407796 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9ac0e00c-0e1d-40fa-802d-8a77ac4c842b/setup-container/0.log" Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.449968 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9ac0e00c-0e1d-40fa-802d-8a77ac4c842b/rabbitmq/0.log" Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.646512 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9f14785b-2e99-4110-9523-78ec32490e71/setup-container/0.log" Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.847927 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9f14785b-2e99-4110-9523-78ec32490e71/setup-container/0.log" Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.892765 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9f14785b-2e99-4110-9523-78ec32490e71/rabbitmq/0.log" Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.062042 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6_baff33d3-a587-4283-a861-38d88a47539e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.210242 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2hf52_d2554051-f8a8-413e-b352-13ac8f88da63/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.425831 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx_78d0d06a-2199-4c5c-99e9-5bf916d8f30e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.531098 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-r6c9k_44ab5cbe-e4cd-4036-8768-104fcf0d8963/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.766988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rr66z_812efe23-7ca7-49b9-bd76-194a82c603b3/ssh-known-hosts-edpm-deployment/0.log" Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.014012 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b7b95d7bc-zqb9x_42a3f824-28fe-4734-8ada-a74ffb9930a8/proxy-server/0.log" Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.028555 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b7b95d7bc-zqb9x_42a3f824-28fe-4734-8ada-a74ffb9930a8/proxy-httpd/0.log" Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.254668 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-q5fhk_c81edb08-7ac8-4cfc-abce-5895b8e7b59b/swift-ring-rebalance/0.log" Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.386716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-auditor/0.log" Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.478599 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-reaper/0.log" Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.548657 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-replicator/0.log" Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.704337 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-server/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.016237 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-auditor/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.128324 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-replicator/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.212843 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-server/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.305519 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-updater/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.493013 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-expirer/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.494463 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-auditor/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.628491 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-replicator/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.752644 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-server/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.831271 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-updater/0.log" Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.923787 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/rsync/0.log" Feb 19 20:05:32 crc kubenswrapper[4722]: I0219 20:05:32.027065 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/swift-recon-cron/0.log" Feb 19 20:05:32 crc kubenswrapper[4722]: I0219 20:05:32.361947 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp_4a2c74da-6ac0-4070-9f5a-577bc5c64771/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:32 crc kubenswrapper[4722]: I0219 20:05:32.441955 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v_b51489f6-90e0-4a0d-ae54-24eb1e6f5568/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 20:05:34 crc kubenswrapper[4722]: I0219 20:05:34.477859 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_0e52a6ab-57a4-4fd1-bd50-1832e756fc7f/cloudkitty-proc/0.log" Feb 19 20:05:39 crc kubenswrapper[4722]: I0219 20:05:39.456905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_059950bd-4e60-42e6-a9c6-4e4ab0b039aa/memcached/0.log" Feb 19 20:05:41 crc kubenswrapper[4722]: I0219 20:05:41.798642 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:05:41 crc kubenswrapper[4722]: I0219 20:05:41.799332 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.969401 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:05:52 crc kubenswrapper[4722]: E0219 20:05:52.970389 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerName="container-00" Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.970405 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerName="container-00" Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.970628 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerName="container-00" Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.972078 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.978592 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.115238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.115422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.115475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.216857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217025 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217264 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217467 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.238682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.326401 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.917936 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:05:54 crc kubenswrapper[4722]: I0219 20:05:54.210810 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef"} Feb 19 20:05:54 crc kubenswrapper[4722]: I0219 20:05:54.211073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"00eb92e2f6c141e6b8c8c232efe8dff7c051cec833dd70ef1159394157c42a2e"} Feb 19 20:05:55 crc kubenswrapper[4722]: I0219 20:05:55.220956 4722 generic.go:334] "Generic (PLEG): container finished" podID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerID="59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef" exitCode=0 Feb 19 20:05:55 crc kubenswrapper[4722]: I0219 20:05:55.221036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef"} Feb 19 20:05:55 crc kubenswrapper[4722]: I0219 20:05:55.221391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb"} Feb 19 20:05:56 crc kubenswrapper[4722]: I0219 20:05:56.233805 4722 generic.go:334] "Generic (PLEG): container finished" podID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" 
containerID="ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb" exitCode=0 Feb 19 20:05:56 crc kubenswrapper[4722]: I0219 20:05:56.234002 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb"} Feb 19 20:05:57 crc kubenswrapper[4722]: I0219 20:05:57.246274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29"} Feb 19 20:05:57 crc kubenswrapper[4722]: I0219 20:05:57.270848 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lv7l8" podStartSLOduration=2.621294528 podStartE2EDuration="5.27083348s" podCreationTimestamp="2026-02-19 20:05:52 +0000 UTC" firstStartedPulling="2026-02-19 20:05:54.21591752 +0000 UTC m=+2853.828267844" lastFinishedPulling="2026-02-19 20:05:56.865456472 +0000 UTC m=+2856.477806796" observedRunningTime="2026-02-19 20:05:57.2692181 +0000 UTC m=+2856.881568434" watchObservedRunningTime="2026-02-19 20:05:57.27083348 +0000 UTC m=+2856.883183794" Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.446733 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/util/0.log" Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.628194 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/util/0.log" Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.636346 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/pull/0.log" Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.857513 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/pull/0.log" Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.987989 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/pull/0.log" Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.005678 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/util/0.log" Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.027344 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/extract/0.log" Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.327922 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.328267 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.382769 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.580257 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-mc64t_edbe95e5-3a5d-4dec-9a94-509234857155/manager/0.log" Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.947303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-hxv5g_baba09d1-2238-4ca1-98ee-f44938b68cd3/manager/0.log" Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.175982 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-qrsw8_019f7edd-1d9b-4069-a2a1-36bbe6b0a567/manager/0.log" Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.375832 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.401909 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-hncxm_2c02c7e1-6f72-44be-a4fb-10ca1df420aa/manager/0.log" Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.439606 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.966981 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-x7bwr_b64009a1-83ef-4d66-bc6b-80ccfc6f7727/manager/0.log" Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.972439 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-rnh9h_c36983b4-b7f9-4834-85e9-a5c3cb83eb2d/manager/0.log" Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.046969 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-q5kgj_421f6539-4fcb-4949-ba29-34997fc98490/manager/0.log" Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.309448 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-x6wk7_db329f91-74f2-4baa-ab5a-85ad999fc8ef/manager/0.log" Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.526747 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-7qkx4_766eebc1-05fc-4ca0-8c75-276632a6597e/manager/0.log" Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.835767 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-8cljg_b37b04c7-5374-49d3-97c0-5b5b27c4a220/manager/0.log" Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.024238 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-6t7g6_57783601-5230-49ef-8ac2-0ddf78bd4b3a/manager/0.log" Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.376529 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lv7l8" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server" containerID="cri-o://9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29" gracePeriod=2 Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.609928 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-wqp5t_64ff9a64-f79f-4a45-943d-36152964cfcd/manager/0.log" Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.834537 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9csntwh_8870a7b1-f894-4429-9f52-d9063fe9c780/manager/0.log" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.429814 4722 generic.go:334] "Generic (PLEG): container finished" podID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerID="9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29" exitCode=0 Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.430068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29"} Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.488063 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6ddf4746f6-l927q_fb86a4c4-379d-4dcd-86c5-5ee95092e6c0/operator/0.log" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.555947 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.641506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.641833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.641955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.643408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities" (OuterVolumeSpecName: "utilities") pod "9af17da0-e01d-43d5-9a7d-d97b1ff552c6" (UID: "9af17da0-e01d-43d5-9a7d-d97b1ff552c6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.649830 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.650809 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw" (OuterVolumeSpecName: "kube-api-access-hmmkw") pod "9af17da0-e01d-43d5-9a7d-d97b1ff552c6" (UID: "9af17da0-e01d-43d5-9a7d-d97b1ff552c6"). InnerVolumeSpecName "kube-api-access-hmmkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.712651 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9af17da0-e01d-43d5-9a7d-d97b1ff552c6" (UID: "9af17da0-e01d-43d5-9a7d-d97b1ff552c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.746350 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-knsfg_efd426b6-a53d-4127-ae59-e2f9aec632cc/registry-server/0.log" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.757354 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.757383 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") on node \"crc\" DevicePath \"\"" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.096416 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-6dlqc_738a1346-88e9-4c4e-b7ce-1878736e2493/manager/0.log" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.308250 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-mgzgq_820eede6-6396-4466-bf00-5d3b39d982d6/manager/0.log" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.442118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"00eb92e2f6c141e6b8c8c232efe8dff7c051cec833dd70ef1159394157c42a2e"} Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.442200 4722 scope.go:117] "RemoveContainer" containerID="9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.442374 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.485125 4722 scope.go:117] "RemoveContainer" containerID="ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.500110 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.514513 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.569615 4722 scope.go:117] "RemoveContainer" containerID="59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.594870 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pjv7d_65b17979-6c94-40e6-ac54-41a61a726e87/operator/0.log" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.872587 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-wktqn_29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.020753 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-zft4s_a6fb3554-24ea-4330-b2cb-1c91f105345d/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.104943 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" path="/var/lib/kubelet/pods/9af17da0-e01d-43d5-9a7d-d97b1ff552c6/volumes" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.299295 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-dbdmf_2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.707305 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-zdfxj_f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.827350 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5484b6858b-7g48c_792a7a0a-a11e-42ce-a99b-e24127e7bbe8/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.881676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f8cf67456-vwhlj_12f061e0-51af-4ab9-a8a7-26b2775651e1/manager/0.log" Feb 19 20:06:11 crc kubenswrapper[4722]: I0219 20:06:11.797865 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:06:11 crc kubenswrapper[4722]: I0219 20:06:11.798216 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:06:11 crc kubenswrapper[4722]: I0219 20:06:11.958721 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-k5c54_0af2e6ef-277d-4022-b42b-5639b589fef9/manager/0.log" Feb 19 20:06:31 crc kubenswrapper[4722]: I0219 20:06:31.178607 4722 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r4jmd_41fade82-0d8d-41b2-805e-8a92ffa97cf3/control-plane-machine-set-operator/0.log" Feb 19 20:06:31 crc kubenswrapper[4722]: I0219 20:06:31.356522 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-glfz9_bf8b7b84-382a-410f-8dea-c4f485402a77/kube-rbac-proxy/0.log" Feb 19 20:06:31 crc kubenswrapper[4722]: I0219 20:06:31.376555 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-glfz9_bf8b7b84-382a-410f-8dea-c4f485402a77/machine-api-operator/0.log" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.798592 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.799083 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.799126 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.799954 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.800011 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9" gracePeriod=600 Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797244 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9" exitCode=0 Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9"} Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"} Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797539 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:06:45 crc kubenswrapper[4722]: I0219 20:06:45.450590 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fz7bp_9545d522-f459-4b98-ac7f-d107189b7497/cert-manager-controller/0.log" Feb 19 20:06:45 crc kubenswrapper[4722]: I0219 20:06:45.644577 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-242s6_b1356eef-86bd-4fbf-beb6-a98cd8bc60b8/cert-manager-cainjector/0.log" Feb 19 20:06:45 crc kubenswrapper[4722]: I0219 20:06:45.755001 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hzrck_e49e50d8-05f3-42f4-a03a-f3a750e1a134/cert-manager-webhook/0.log" Feb 19 20:07:00 crc kubenswrapper[4722]: I0219 20:07:00.749347 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-nlx9v_ed131fa7-525a-481d-83a9-4fef817dc7ce/nmstate-console-plugin/0.log" Feb 19 20:07:00 crc kubenswrapper[4722]: I0219 20:07:00.952219 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tvslw_59139bb2-e1ae-4f74-96fe-6ea34d232cd9/nmstate-handler/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.048332 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-t5lsr_62ed738c-2401-4b21-b6a8-1bc2c1c009ae/kube-rbac-proxy/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.122010 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-t5lsr_62ed738c-2401-4b21-b6a8-1bc2c1c009ae/nmstate-metrics/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.199872 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-hclph_296e010f-202c-4c01-836e-be6c48607e5f/nmstate-operator/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.337260 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-9jmpv_f9185385-162a-40a7-9563-3c668080b9e9/nmstate-webhook/0.log" Feb 19 20:07:14 crc kubenswrapper[4722]: I0219 20:07:14.919674 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/kube-rbac-proxy/0.log" Feb 19 20:07:14 crc kubenswrapper[4722]: I0219 20:07:14.946815 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/manager/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.128366 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7lzn_572e9436-e389-4b1e-b86f-e13f14f8d3eb/prometheus-operator/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.322311 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_1577ee2f-abd8-4e61-9fd1-238960e8bdf6/prometheus-operator-admission-webhook/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.329484 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_cc8f56cb-a9d1-4b27-adca-40adf6902cc8/prometheus-operator-admission-webhook/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.503905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4qpbt_7f659845-54cc-4e5c-892c-a754900c1f39/perses-operator/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.520777 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8xtkk_68e6d18b-f149-46fb-ba46-8fb37d82712a/operator/0.log" Feb 19 20:07:42 crc kubenswrapper[4722]: I0219 20:07:42.704850 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-h9kn7_1a80711d-831e-42ab-a5f8-6272eba9c635/kube-rbac-proxy/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 
20:07:43.053815 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-h9kn7_1a80711d-831e-42ab-a5f8-6272eba9c635/controller/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.117856 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.321510 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.401382 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.406100 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.415680 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.668475 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.670322 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.676852 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.733633 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.951165 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.951256 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.985133 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.005983 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/controller/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.141658 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/frr-metrics/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.227534 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/kube-rbac-proxy/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.335340 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/kube-rbac-proxy-frr/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.371571 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/reloader/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.547866 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-8nh6q_505e06e7-65a2-4444-8552-8b96253c87fc/frr-k8s-webhook-server/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.705409 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84788dc4db-d5shx_f41ca32e-24fc-427a-a2bc-76e4d5abba0f/manager/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.818014 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78b8d96b76-5d9t2_02eda63c-5131-407e-bb2e-7ad0adf0e985/webhook-server/0.log" Feb 19 20:07:45 crc kubenswrapper[4722]: I0219 20:07:45.009985 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nnmrq_d1319426-40ee-40fc-86bf-64cca26d6860/kube-rbac-proxy/0.log" Feb 19 20:07:45 crc kubenswrapper[4722]: I0219 20:07:45.444249 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nnmrq_d1319426-40ee-40fc-86bf-64cca26d6860/speaker/0.log" Feb 19 20:07:45 crc kubenswrapper[4722]: I0219 20:07:45.536022 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/frr/0.log" Feb 19 20:07:57 crc kubenswrapper[4722]: I0219 20:07:57.974594 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.181932 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.204668 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.216280 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.358678 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.376783 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/extract/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.392394 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.538521 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.738181 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.739452 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/pull/0.log" Feb 19 
20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.760968 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.925286 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.952946 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.963671 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/extract/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.097989 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/util/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.269241 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/pull/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.269412 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/pull/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.290086 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/util/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.469712 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/extract/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.489526 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/util/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.496873 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/pull/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.639999 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-utilities/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.825372 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-content/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.835453 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-utilities/0.log" Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.864643 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-content/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.040967 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-utilities/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.040978 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-content/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.248747 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-utilities/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.521134 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/registry-server/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.542301 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-content/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.570627 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-utilities/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.615053 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-content/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.746555 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-utilities/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.772421 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-content/0.log" Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.968066 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/util/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.216136 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/pull/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.277503 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/pull/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.300615 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/util/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.390621 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/registry-server/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.487476 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/util/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.505469 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/extract/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 
20:08:01.532748 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/pull/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.559610 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lrwfz_6fb12d29-ac35-4e04-a25d-05b1b2545b81/marketplace-operator/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.723623 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-utilities/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.888104 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-utilities/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.888117 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-content/0.log" Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.895931 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-content/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.111344 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-content/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.111373 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-utilities/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.141723 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-utilities/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.248691 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/registry-server/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.359246 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-content/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.360358 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-utilities/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.379811 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-content/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.536416 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-utilities/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.563661 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-content/0.log" Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.994678 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/registry-server/0.log" Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.051681 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_cc8f56cb-a9d1-4b27-adca-40adf6902cc8/prometheus-operator-admission-webhook/0.log" Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.051904 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_1577ee2f-abd8-4e61-9fd1-238960e8bdf6/prometheus-operator-admission-webhook/0.log" Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.079676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7lzn_572e9436-e389-4b1e-b86f-e13f14f8d3eb/prometheus-operator/0.log" Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.299771 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8xtkk_68e6d18b-f149-46fb-ba46-8fb37d82712a/operator/0.log" Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.301523 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4qpbt_7f659845-54cc-4e5c-892c-a754900c1f39/perses-operator/0.log" Feb 19 20:08:27 crc kubenswrapper[4722]: I0219 20:08:27.897961 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/manager/0.log" Feb 19 20:08:27 crc kubenswrapper[4722]: I0219 20:08:27.920246 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/kube-rbac-proxy/0.log" Feb 19 20:09:11 crc kubenswrapper[4722]: I0219 20:09:11.798774 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:09:11 crc kubenswrapper[4722]: I0219 20:09:11.800977 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:09:41 crc kubenswrapper[4722]: I0219 20:09:41.798557 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:09:41 crc kubenswrapper[4722]: I0219 20:09:41.799445 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.369851 4722 generic.go:334] "Generic (PLEG): container finished" podID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103" exitCode=0 Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.369961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerDied","Data":"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"} Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.371608 4722 scope.go:117] "RemoveContainer" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103" 
Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.470924 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rtsd9_must-gather-h964l_71becbc5-18f8-4f0b-ad6d-a12d9846ac73/gather/0.log" Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.784645 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.785456 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rtsd9/must-gather-h964l" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="copy" containerID="cri-o://6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073" gracePeriod=2 Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.799279 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.799529 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.799579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.800447 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"} 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.800515 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" gracePeriod=600 Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.801251 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:10:11 crc kubenswrapper[4722]: E0219 20:10:11.973340 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.351736 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rtsd9_must-gather-h964l_71becbc5-18f8-4f0b-ad6d-a12d9846ac73/copy/0.log" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.352482 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.452188 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.452251 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.459524 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t" (OuterVolumeSpecName: "kube-api-access-m9n8t") pod "71becbc5-18f8-4f0b-ad6d-a12d9846ac73" (UID: "71becbc5-18f8-4f0b-ad6d-a12d9846ac73"). InnerVolumeSpecName "kube-api-access-m9n8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.474282 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" exitCode=0 Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.474356 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"} Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.474407 4722 scope.go:117] "RemoveContainer" containerID="1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.475647 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:10:12 crc kubenswrapper[4722]: E0219 20:10:12.476448 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.478272 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rtsd9_must-gather-h964l_71becbc5-18f8-4f0b-ad6d-a12d9846ac73/copy/0.log" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.480692 4722 generic.go:334] "Generic (PLEG): container finished" podID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073" exitCode=143 Feb 19 20:10:12 crc 
kubenswrapper[4722]: I0219 20:10:12.480777 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.554790 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") on node \"crc\" DevicePath \"\"" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.555518 4722 scope.go:117] "RemoveContainer" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.608948 4722 scope.go:117] "RemoveContainer" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.656588 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "71becbc5-18f8-4f0b-ad6d-a12d9846ac73" (UID: "71becbc5-18f8-4f0b-ad6d-a12d9846ac73"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.658825 4722 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.718146 4722 scope.go:117] "RemoveContainer" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073" Feb 19 20:10:12 crc kubenswrapper[4722]: E0219 20:10:12.718677 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073\": container with ID starting with 6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073 not found: ID does not exist" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.718721 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073"} err="failed to get container status \"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073\": rpc error: code = NotFound desc = could not find container \"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073\": container with ID starting with 6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073 not found: ID does not exist" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.718750 4722 scope.go:117] "RemoveContainer" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103" Feb 19 20:10:12 crc kubenswrapper[4722]: E0219 20:10:12.719309 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103\": container with ID starting with cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103 not found: ID does not exist" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103" Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.719353 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"} err="failed to get container status \"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103\": rpc error: code = NotFound desc = could not find container \"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103\": container with ID starting with cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103 not found: ID does not exist" Feb 19 20:10:13 crc kubenswrapper[4722]: I0219 20:10:13.082990 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" path="/var/lib/kubelet/pods/71becbc5-18f8-4f0b-ad6d-a12d9846ac73/volumes" Feb 19 20:10:24 crc kubenswrapper[4722]: I0219 20:10:24.072082 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:10:24 crc kubenswrapper[4722]: E0219 20:10:24.073568 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.786727 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-8xtkk container/operator namespace/openshift-operators: Readiness probe 
status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.793292 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.794767 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-8xtkk container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.794812 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 20:10:37 crc kubenswrapper[4722]: I0219 20:10:37.071505 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:10:37 crc kubenswrapper[4722]: E0219 20:10:37.072476 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 
19 20:10:46 crc kubenswrapper[4722]: I0219 20:10:46.526140 4722 scope.go:117] "RemoveContainer" containerID="68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c" Feb 19 20:10:46 crc kubenswrapper[4722]: I0219 20:10:46.565929 4722 scope.go:117] "RemoveContainer" containerID="9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef" Feb 19 20:10:48 crc kubenswrapper[4722]: I0219 20:10:48.071859 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:10:48 crc kubenswrapper[4722]: E0219 20:10:48.072760 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:03 crc kubenswrapper[4722]: I0219 20:11:03.071531 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:03 crc kubenswrapper[4722]: E0219 20:11:03.072347 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124304 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124874 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-content" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124890 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-content" Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124914 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-utilities" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124923 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-utilities" Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124938 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="copy" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124946 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="copy" Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124976 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124984 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server" Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124999 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="gather" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125009 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="gather" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125307 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" 
containerName="copy" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125330 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125346 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="gather" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.127063 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.137878 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.245924 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.245974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.246192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " 
pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.347896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348167 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " 
pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.371774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.466466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.990642 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:05 crc kubenswrapper[4722]: W0219 20:11:05.997669 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8098abc5_9bf4_457d_8aff_7f23a653bb59.slice/crio-e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783 WatchSource:0}: Error finding container e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783: Status 404 returned error can't find the container with id e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783 Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.232674 4722 generic.go:334] "Generic (PLEG): container finished" podID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" exitCode=0 Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.232727 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759"} Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.232757 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerStarted","Data":"e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783"} Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.235485 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.241962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerStarted","Data":"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224"} Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.908410 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.912639 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.924817 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.996893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.997041 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.997110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.099045 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.099969 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.100489 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.100280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.099899 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.125582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.247377 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: W0219 20:11:08.786801 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bce7cdd_fc85_4b5a_a7b3_c2b073897ad0.slice/crio-e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483 WatchSource:0}: Error finding container e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483: Status 404 returned error can't find the container with id e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483 Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.789622 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.265969 4722 generic.go:334] "Generic (PLEG): container finished" podID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" exitCode=0 Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.266064 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224"} Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.269432 4722 generic.go:334] "Generic (PLEG): container finished" podID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerID="913a735795c14766116399446b635d98dcb30549bf7fd872ee753ae569a27a4a" exitCode=0 Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.269474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"913a735795c14766116399446b635d98dcb30549bf7fd872ee753ae569a27a4a"} Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 
20:11:09.269505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerStarted","Data":"e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483"} Feb 19 20:11:10 crc kubenswrapper[4722]: I0219 20:11:10.282889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerStarted","Data":"d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628"} Feb 19 20:11:10 crc kubenswrapper[4722]: I0219 20:11:10.287114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerStarted","Data":"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828"} Feb 19 20:11:10 crc kubenswrapper[4722]: I0219 20:11:10.323055 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mz2kj" podStartSLOduration=1.853841251 podStartE2EDuration="5.323037393s" podCreationTimestamp="2026-02-19 20:11:05 +0000 UTC" firstStartedPulling="2026-02-19 20:11:06.235250093 +0000 UTC m=+3165.847600417" lastFinishedPulling="2026-02-19 20:11:09.704446225 +0000 UTC m=+3169.316796559" observedRunningTime="2026-02-19 20:11:10.319342047 +0000 UTC m=+3169.931692371" watchObservedRunningTime="2026-02-19 20:11:10.323037393 +0000 UTC m=+3169.935387717" Feb 19 20:11:11 crc kubenswrapper[4722]: I0219 20:11:11.298096 4722 generic.go:334] "Generic (PLEG): container finished" podID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerID="d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628" exitCode=0 Feb 19 20:11:11 crc kubenswrapper[4722]: I0219 20:11:11.298169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" 
event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628"} Feb 19 20:11:12 crc kubenswrapper[4722]: I0219 20:11:12.309604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerStarted","Data":"22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762"} Feb 19 20:11:12 crc kubenswrapper[4722]: I0219 20:11:12.353276 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zbfjr" podStartSLOduration=2.66676097 podStartE2EDuration="5.353241991s" podCreationTimestamp="2026-02-19 20:11:07 +0000 UTC" firstStartedPulling="2026-02-19 20:11:09.271880144 +0000 UTC m=+3168.884230468" lastFinishedPulling="2026-02-19 20:11:11.958361165 +0000 UTC m=+3171.570711489" observedRunningTime="2026-02-19 20:11:12.333366521 +0000 UTC m=+3171.945716845" watchObservedRunningTime="2026-02-19 20:11:12.353241991 +0000 UTC m=+3171.965592355" Feb 19 20:11:15 crc kubenswrapper[4722]: I0219 20:11:15.466777 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:15 crc kubenswrapper[4722]: I0219 20:11:15.467572 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:15 crc kubenswrapper[4722]: I0219 20:11:15.541659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:16 crc kubenswrapper[4722]: I0219 20:11:16.071845 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:16 crc kubenswrapper[4722]: E0219 20:11:16.072347 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:16 crc kubenswrapper[4722]: I0219 20:11:16.443480 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:17 crc kubenswrapper[4722]: I0219 20:11:17.498099 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.247834 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.247886 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.309700 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.375548 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mz2kj" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" containerID="cri-o://2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" gracePeriod=2 Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.445963 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.014274 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031003 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"8098abc5-9bf4-457d-8aff-7f23a653bb59\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"8098abc5-9bf4-457d-8aff-7f23a653bb59\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"8098abc5-9bf4-457d-8aff-7f23a653bb59\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities" (OuterVolumeSpecName: "utilities") pod "8098abc5-9bf4-457d-8aff-7f23a653bb59" (UID: "8098abc5-9bf4-457d-8aff-7f23a653bb59"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.032694 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.043660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq" (OuterVolumeSpecName: "kube-api-access-nq9zq") pod "8098abc5-9bf4-457d-8aff-7f23a653bb59" (UID: "8098abc5-9bf4-457d-8aff-7f23a653bb59"). InnerVolumeSpecName "kube-api-access-nq9zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.096710 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8098abc5-9bf4-457d-8aff-7f23a653bb59" (UID: "8098abc5-9bf4-457d-8aff-7f23a653bb59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.135334 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.135365 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389787 4722 generic.go:334] "Generic (PLEG): container finished" podID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" exitCode=0 Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389837 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828"} Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783"} Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389998 4722 scope.go:117] "RemoveContainer" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.412024 4722 scope.go:117] "RemoveContainer" 
containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.438963 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.453981 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.456611 4722 scope.go:117] "RemoveContainer" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.506169 4722 scope.go:117] "RemoveContainer" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" Feb 19 20:11:19 crc kubenswrapper[4722]: E0219 20:11:19.506659 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828\": container with ID starting with 2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828 not found: ID does not exist" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.506700 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828"} err="failed to get container status \"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828\": rpc error: code = NotFound desc = could not find container \"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828\": container with ID starting with 2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828 not found: ID does not exist" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.506725 4722 scope.go:117] "RemoveContainer" 
containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" Feb 19 20:11:19 crc kubenswrapper[4722]: E0219 20:11:19.507208 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224\": container with ID starting with 6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224 not found: ID does not exist" containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.507260 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224"} err="failed to get container status \"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224\": rpc error: code = NotFound desc = could not find container \"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224\": container with ID starting with 6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224 not found: ID does not exist" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.507295 4722 scope.go:117] "RemoveContainer" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" Feb 19 20:11:19 crc kubenswrapper[4722]: E0219 20:11:19.507623 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759\": container with ID starting with e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759 not found: ID does not exist" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.507653 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759"} err="failed to get container status \"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759\": rpc error: code = NotFound desc = could not find container \"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759\": container with ID starting with e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759 not found: ID does not exist" Feb 19 20:11:21 crc kubenswrapper[4722]: I0219 20:11:21.102279 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" path="/var/lib/kubelet/pods/8098abc5-9bf4-457d-8aff-7f23a653bb59/volumes" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.112842 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.113582 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zbfjr" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" containerID="cri-o://22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762" gracePeriod=2 Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.442643 4722 generic.go:334] "Generic (PLEG): container finished" podID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerID="22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762" exitCode=0 Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.442739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762"} Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.619726 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.735690 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.736028 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.736115 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.737550 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities" (OuterVolumeSpecName: "utilities") pod "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" (UID: "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.744242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt" (OuterVolumeSpecName: "kube-api-access-6zswt") pod "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" (UID: "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0"). InnerVolumeSpecName "kube-api-access-6zswt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.763824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" (UID: "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.839405 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.839785 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.839798 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.458266 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483"} Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.458343 4722 scope.go:117] "RemoveContainer" containerID="22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.458351 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.510302 4722 scope.go:117] "RemoveContainer" containerID="d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.514731 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.525850 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.536983 4722 scope.go:117] "RemoveContainer" containerID="913a735795c14766116399446b635d98dcb30549bf7fd872ee753ae569a27a4a" Feb 19 20:11:25 crc kubenswrapper[4722]: I0219 20:11:25.087860 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" path="/var/lib/kubelet/pods/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0/volumes" Feb 19 20:11:31 crc kubenswrapper[4722]: I0219 20:11:31.077550 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:31 crc kubenswrapper[4722]: E0219 20:11:31.078362 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:42 crc kubenswrapper[4722]: I0219 20:11:42.071991 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:42 crc kubenswrapper[4722]: E0219 20:11:42.072918 4722 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.189888 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191070 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191082 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191093 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191099 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191110 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191116 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191131 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 
20:11:47.191137 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191185 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191192 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191207 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191213 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191400 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191414 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.192962 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.204312 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.338960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.339027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.339279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.467786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.523442 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.991916 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:11:48 crc kubenswrapper[4722]: I0219 20:11:48.732343 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" exitCode=0 Feb 19 20:11:48 crc kubenswrapper[4722]: I0219 20:11:48.732393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8"} Feb 19 20:11:48 crc kubenswrapper[4722]: I0219 20:11:48.732608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerStarted","Data":"56b15ecd82052d49803a12a739aa9fc81dc3ed1ffdb9822b4047322c5da5eefe"} Feb 19 20:11:49 crc kubenswrapper[4722]: I0219 20:11:49.745072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerStarted","Data":"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1"} Feb 19 20:11:53 crc kubenswrapper[4722]: I0219 20:11:53.790976 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" exitCode=0 Feb 19 20:11:53 crc kubenswrapper[4722]: I0219 20:11:53.791073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" 
event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1"} Feb 19 20:11:54 crc kubenswrapper[4722]: I0219 20:11:54.072350 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:54 crc kubenswrapper[4722]: E0219 20:11:54.072900 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:54 crc kubenswrapper[4722]: I0219 20:11:54.808420 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerStarted","Data":"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a"} Feb 19 20:11:54 crc kubenswrapper[4722]: I0219 20:11:54.832495 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7h79q" podStartSLOduration=2.365717994 podStartE2EDuration="7.832476727s" podCreationTimestamp="2026-02-19 20:11:47 +0000 UTC" firstStartedPulling="2026-02-19 20:11:48.734230195 +0000 UTC m=+3208.346580559" lastFinishedPulling="2026-02-19 20:11:54.200988958 +0000 UTC m=+3213.813339292" observedRunningTime="2026-02-19 20:11:54.827380549 +0000 UTC m=+3214.439730933" watchObservedRunningTime="2026-02-19 20:11:54.832476727 +0000 UTC m=+3214.444827051" Feb 19 20:11:57 crc kubenswrapper[4722]: I0219 20:11:57.525070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:57 crc 
kubenswrapper[4722]: I0219 20:11:57.525453 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:58 crc kubenswrapper[4722]: I0219 20:11:58.596251 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7h79q" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" probeResult="failure" output=< Feb 19 20:11:58 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 20:11:58 crc kubenswrapper[4722]: > Feb 19 20:12:07 crc kubenswrapper[4722]: I0219 20:12:07.590562 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:07 crc kubenswrapper[4722]: I0219 20:12:07.652245 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:07 crc kubenswrapper[4722]: I0219 20:12:07.829986 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:12:08 crc kubenswrapper[4722]: I0219 20:12:08.071878 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:08 crc kubenswrapper[4722]: E0219 20:12:08.072215 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:12:08 crc kubenswrapper[4722]: I0219 20:12:08.955129 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7h79q" 
podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" containerID="cri-o://1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" gracePeriod=2 Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.527864 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.607129 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"9ab54680-8998-4ed7-aa56-8196f18629c5\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.607470 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"9ab54680-8998-4ed7-aa56-8196f18629c5\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.607656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"9ab54680-8998-4ed7-aa56-8196f18629c5\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.608851 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities" (OuterVolumeSpecName: "utilities") pod "9ab54680-8998-4ed7-aa56-8196f18629c5" (UID: "9ab54680-8998-4ed7-aa56-8196f18629c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.611639 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.620317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv" (OuterVolumeSpecName: "kube-api-access-rcjpv") pod "9ab54680-8998-4ed7-aa56-8196f18629c5" (UID: "9ab54680-8998-4ed7-aa56-8196f18629c5"). InnerVolumeSpecName "kube-api-access-rcjpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.713594 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") on node \"crc\" DevicePath \"\"" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.749832 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ab54680-8998-4ed7-aa56-8196f18629c5" (UID: "9ab54680-8998-4ed7-aa56-8196f18629c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.816206 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.968795 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" exitCode=0 Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.968863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a"} Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.969598 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"56b15ecd82052d49803a12a739aa9fc81dc3ed1ffdb9822b4047322c5da5eefe"} Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.968869 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.969639 4722 scope.go:117] "RemoveContainer" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.989270 4722 scope.go:117] "RemoveContainer" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.012569 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.025902 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.037875 4722 scope.go:117] "RemoveContainer" containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.061756 4722 scope.go:117] "RemoveContainer" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" Feb 19 20:12:10 crc kubenswrapper[4722]: E0219 20:12:10.062266 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a\": container with ID starting with 1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a not found: ID does not exist" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062310 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a"} err="failed to get container status \"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a\": rpc error: code = NotFound desc = could not find container 
\"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a\": container with ID starting with 1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a not found: ID does not exist" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062335 4722 scope.go:117] "RemoveContainer" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" Feb 19 20:12:10 crc kubenswrapper[4722]: E0219 20:12:10.062653 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1\": container with ID starting with 64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1 not found: ID does not exist" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062702 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1"} err="failed to get container status \"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1\": rpc error: code = NotFound desc = could not find container \"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1\": container with ID starting with 64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1 not found: ID does not exist" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062734 4722 scope.go:117] "RemoveContainer" containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" Feb 19 20:12:10 crc kubenswrapper[4722]: E0219 20:12:10.063102 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8\": container with ID starting with 217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8 not found: ID does not exist" 
containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.063179 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8"} err="failed to get container status \"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8\": rpc error: code = NotFound desc = could not find container \"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8\": container with ID starting with 217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8 not found: ID does not exist" Feb 19 20:12:11 crc kubenswrapper[4722]: I0219 20:12:11.083424 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" path="/var/lib/kubelet/pods/9ab54680-8998-4ed7-aa56-8196f18629c5/volumes" Feb 19 20:12:23 crc kubenswrapper[4722]: I0219 20:12:23.071636 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:23 crc kubenswrapper[4722]: E0219 20:12:23.073430 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:12:36 crc kubenswrapper[4722]: I0219 20:12:36.071877 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:36 crc kubenswrapper[4722]: E0219 20:12:36.072589 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:12:47 crc kubenswrapper[4722]: I0219 20:12:47.076344 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:47 crc kubenswrapper[4722]: E0219 20:12:47.077548 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:00 crc kubenswrapper[4722]: I0219 20:13:00.072414 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:00 crc kubenswrapper[4722]: E0219 20:13:00.074133 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:15 crc kubenswrapper[4722]: I0219 20:13:15.071623 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:15 crc kubenswrapper[4722]: E0219 20:13:15.072497 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:28 crc kubenswrapper[4722]: I0219 20:13:28.077294 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:28 crc kubenswrapper[4722]: E0219 20:13:28.079963 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:42 crc kubenswrapper[4722]: I0219 20:13:42.071991 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:42 crc kubenswrapper[4722]: E0219 20:13:42.072656 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:54 crc kubenswrapper[4722]: I0219 20:13:54.071445 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:54 crc kubenswrapper[4722]: E0219 20:13:54.072276 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:09 crc kubenswrapper[4722]: I0219 20:14:09.071346 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:09 crc kubenswrapper[4722]: E0219 20:14:09.072072 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:20 crc kubenswrapper[4722]: I0219 20:14:20.071440 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:20 crc kubenswrapper[4722]: E0219 20:14:20.072414 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:33 crc kubenswrapper[4722]: I0219 20:14:33.071397 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:33 crc kubenswrapper[4722]: E0219 20:14:33.072441 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:45 crc kubenswrapper[4722]: I0219 20:14:45.072144 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:45 crc kubenswrapper[4722]: E0219 20:14:45.073207 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:59 crc kubenswrapper[4722]: I0219 20:14:59.080785 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:59 crc kubenswrapper[4722]: E0219 20:14:59.081807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.152356 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g"] Feb 19 20:15:00 crc kubenswrapper[4722]: E0219 20:15:00.153227 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-utilities" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153243 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-utilities" Feb 19 20:15:00 crc kubenswrapper[4722]: E0219 20:15:00.153261 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153270 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4722]: E0219 20:15:00.153307 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-content" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153314 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-content" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153555 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.154518 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.156805 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.157111 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.168857 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g"] Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.314347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.314702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.314976 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.417627 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.417810 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.417978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.419202 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.425261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.438264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.504967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.988109 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g"] Feb 19 20:15:01 crc kubenswrapper[4722]: I0219 20:15:01.893873 4722 generic.go:334] "Generic (PLEG): container finished" podID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerID="bd3698ac075c0a7914916910624265b1e5a98426ab24840ac159e552ca7e1514" exitCode=0 Feb 19 20:15:01 crc kubenswrapper[4722]: I0219 20:15:01.893967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" event={"ID":"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa","Type":"ContainerDied","Data":"bd3698ac075c0a7914916910624265b1e5a98426ab24840ac159e552ca7e1514"} Feb 19 20:15:01 crc kubenswrapper[4722]: I0219 20:15:01.894249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" 
event={"ID":"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa","Type":"ContainerStarted","Data":"3fb4aa9edf887cb0b6574567b0ca3a21b764ceb2a300394ec2dd282ee7979c71"} Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.393097 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.490787 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.490877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.490931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.492092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume" (OuterVolumeSpecName: "config-volume") pod "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" (UID: "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.495840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6" (OuterVolumeSpecName: "kube-api-access-dsxt6") pod "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" (UID: "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa"). InnerVolumeSpecName "kube-api-access-dsxt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.502364 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" (UID: "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.593036 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.593280 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.593361 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.918785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" 
event={"ID":"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa","Type":"ContainerDied","Data":"3fb4aa9edf887cb0b6574567b0ca3a21b764ceb2a300394ec2dd282ee7979c71"} Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.918832 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.918866 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb4aa9edf887cb0b6574567b0ca3a21b764ceb2a300394ec2dd282ee7979c71" Feb 19 20:15:04 crc kubenswrapper[4722]: I0219 20:15:04.498434 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"] Feb 19 20:15:04 crc kubenswrapper[4722]: I0219 20:15:04.506913 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"] Feb 19 20:15:05 crc kubenswrapper[4722]: I0219 20:15:05.092358 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" path="/var/lib/kubelet/pods/b6513190-cf4a-405f-a7ca-c35f37d63725/volumes" Feb 19 20:15:11 crc kubenswrapper[4722]: I0219 20:15:11.077960 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:15:11 crc kubenswrapper[4722]: E0219 20:15:11.078805 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:15:26 crc kubenswrapper[4722]: I0219 20:15:26.072920 4722 scope.go:117] "RemoveContainer" 
containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:15:27 crc kubenswrapper[4722]: I0219 20:15:27.157209 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"046e022c7f6b949833e8ff39bdeb53a1dc51d358998927287ef6b0062550383b"} Feb 19 20:15:46 crc kubenswrapper[4722]: I0219 20:15:46.807713 4722 scope.go:117] "RemoveContainer" containerID="1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.773507 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:20 crc kubenswrapper[4722]: E0219 20:16:20.774641 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerName="collect-profiles" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.774659 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerName="collect-profiles" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.774937 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerName="collect-profiles" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.777778 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.817966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.886206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.886375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.886444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.988555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.988690 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.988777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.989257 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.989389 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.010797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.138561 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.646551 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:21 crc kubenswrapper[4722]: W0219 20:16:21.649087 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b92f9e9_b0bf_4870_a012_bcc485ce62c7.slice/crio-5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44 WatchSource:0}: Error finding container 5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44: Status 404 returned error can't find the container with id 5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44 Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.724940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerStarted","Data":"5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44"} Feb 19 20:16:22 crc kubenswrapper[4722]: I0219 20:16:22.734500 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" exitCode=0 Feb 19 20:16:22 crc kubenswrapper[4722]: I0219 20:16:22.734566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd"} Feb 19 20:16:22 crc kubenswrapper[4722]: I0219 20:16:22.737288 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:16:23 crc kubenswrapper[4722]: I0219 20:16:23.745267 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerStarted","Data":"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb"} Feb 19 20:16:24 crc kubenswrapper[4722]: I0219 20:16:24.757604 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" exitCode=0 Feb 19 20:16:24 crc kubenswrapper[4722]: I0219 20:16:24.757650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb"} Feb 19 20:16:25 crc kubenswrapper[4722]: I0219 20:16:25.770068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerStarted","Data":"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23"} Feb 19 20:16:25 crc kubenswrapper[4722]: I0219 20:16:25.800297 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d872l" podStartSLOduration=3.318932277 podStartE2EDuration="5.800278603s" podCreationTimestamp="2026-02-19 20:16:20 +0000 UTC" firstStartedPulling="2026-02-19 20:16:22.736998811 +0000 UTC m=+3482.349349135" lastFinishedPulling="2026-02-19 20:16:25.218345137 +0000 UTC m=+3484.830695461" observedRunningTime="2026-02-19 20:16:25.789808917 +0000 UTC m=+3485.402159241" watchObservedRunningTime="2026-02-19 20:16:25.800278603 +0000 UTC m=+3485.412628927" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.140081 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.140739 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.191447 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.933372 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:32 crc kubenswrapper[4722]: I0219 20:16:32.002411 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:33 crc kubenswrapper[4722]: I0219 20:16:33.872890 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d872l" podUID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerName="registry-server" containerID="cri-o://9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" gracePeriod=2 Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.880178 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882631 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" exitCode=0 Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23"} Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44"} Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882723 4722 scope.go:117] "RemoveContainer" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.904089 4722 scope.go:117] "RemoveContainer" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.933592 4722 scope.go:117] "RemoveContainer" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.976837 4722 scope.go:117] "RemoveContainer" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" Feb 19 20:16:34 crc kubenswrapper[4722]: E0219 20:16:34.977297 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23\": container with ID starting with 
9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23 not found: ID does not exist" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977339 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23"} err="failed to get container status \"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23\": rpc error: code = NotFound desc = could not find container \"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23\": container with ID starting with 9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23 not found: ID does not exist" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977364 4722 scope.go:117] "RemoveContainer" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" Feb 19 20:16:34 crc kubenswrapper[4722]: E0219 20:16:34.977599 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb\": container with ID starting with 49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb not found: ID does not exist" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977626 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb"} err="failed to get container status \"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb\": rpc error: code = NotFound desc = could not find container \"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb\": container with ID starting with 49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb not found: ID does not 
exist" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977643 4722 scope.go:117] "RemoveContainer" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" Feb 19 20:16:34 crc kubenswrapper[4722]: E0219 20:16:34.977884 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd\": container with ID starting with 09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd not found: ID does not exist" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977908 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd"} err="failed to get container status \"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd\": rpc error: code = NotFound desc = could not find container \"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd\": container with ID starting with 09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd not found: ID does not exist" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.991480 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.991763 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " Feb 19 20:16:34 crc kubenswrapper[4722]: 
I0219 20:16:34.991835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.992965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities" (OuterVolumeSpecName: "utilities") pod "8b92f9e9-b0bf-4870-a012-bcc485ce62c7" (UID: "8b92f9e9-b0bf-4870-a012-bcc485ce62c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.004002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv" (OuterVolumeSpecName: "kube-api-access-c74zv") pod "8b92f9e9-b0bf-4870-a012-bcc485ce62c7" (UID: "8b92f9e9-b0bf-4870-a012-bcc485ce62c7"). InnerVolumeSpecName "kube-api-access-c74zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.048066 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b92f9e9-b0bf-4870-a012-bcc485ce62c7" (UID: "8b92f9e9-b0bf-4870-a012-bcc485ce62c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.094030 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.094058 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.094068 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.892391 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.917140 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.930321 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:37 crc kubenswrapper[4722]: I0219 20:16:37.097478 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" path="/var/lib/kubelet/pods/8b92f9e9-b0bf-4870-a012-bcc485ce62c7/volumes"